Goal: To correctly identify the species of bear given an image of a grizzly, panda, or polar bear
Data Source: Bing Image Search
Data Input: Images of grizzly, panda, and polar bears
Output: Predicted species label for each input image
Download bears.zip, which contains the input images sorted into labeled folders.
!ls -lh
total 4.0K
drwxr-xr-x 1 root root 4.0K Apr 21 13:39 sample_data
!gdown https://drive.google.com/uc?id=1NiOQoOYciSXMD63EBDrb2fPUKRayQgDL
Downloading...
From: https://drive.google.com/uc?id=1NiOQoOYciSXMD63EBDrb2fPUKRayQgDL
To: /content/bears.zip
472MB [00:03, 128MB/s]
!ls -lh
total 450M
-rw-r--r-- 1 root root 450M May  7 01:32 bears.zip
drwxr-xr-x 1 root root 4.0K Apr 21 13:39 sample_data
!unzip -q bears.zip
!rm -r sample_data
!ls
bears bears.zip
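As a quick sanity check (not part of the original workflow), we can count the images in each labeled subfolder to confirm the archive extracted cleanly:

import os
# Count images per class in each split; folder names follow the
# bears/training, bears/validation, bears/test layout used below.
for split in ('training', 'validation', 'test'):
    split_dir = os.path.join('bears', split)
    if not os.path.isdir(split_dir):
        continue
    for species in sorted(os.listdir(split_dir)):
        n = len(os.listdir(os.path.join(split_dir, species)))
        print(f'{split}/{species}: {n} images')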
Show a few representative images.
from tensorflow.keras.preprocessing.image import ImageDataGenerator
image_generator = ImageDataGenerator(rescale=1./255)
training_data = image_generator.flow_from_directory(
    'bears/training', target_size=(256, 256),
    batch_size=9, class_mode='categorical')
print(training_data.image_shape)
Found 727 images belonging to 3 classes.
(256, 256, 3)
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline
L = 3
plt.figure(figsize=(20, 16))
for my_batch in training_data:
    images = my_batch[0]
    labels = my_batch[1]
    for r in range(L):
        for c in range(L):
            plt.subplot(L, L, r * L + c + 1)
            #plt.axis('off')
            #plt.title(labels[r * L + c])
            plt.imshow(images[r * L + c])
    break
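The commented-out plt.title call would show raw one-hot vectors. If readable titles are wanted, one option (a sketch; class_names is a name introduced here) is to invert the generator's class_indices mapping:

# Invert class_indices (e.g. {'grizzly': 0, 'panda': 1, 'polar': 2})
# so each one-hot label can be displayed as its folder name.
class_names = {v: k for k, v in training_data.class_indices.items()}
# Inside the loop, the title line would then become:
#   plt.title(class_names[np.argmax(labels[r * L + c])])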
First we deliberately overfit: using all the data, we build models that reach close to 100% training accuracy. We don't want to commit to the final train/test split at this point, but we do need results that are consistent from run to run. For replicability, we download a second archive containing every image in the dataset, randomly pre-sorted into training and validation folders. This random sort is independent of the random split that will actually be used for developing the CNN.
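For reference, a folder split like this can be produced with a few lines of standard-library Python. This is a minimal sketch under assumed folder names, not the exact procedure used to build the archive below:

import os, random, shutil

random.seed(42)  # fix the shuffle so the split is reproducible

def split_class(src_dir, train_dir, valid_dir, valid_frac=0.2):
    """Randomly copy the files in src_dir into train_dir / valid_dir."""
    files = sorted(os.listdir(src_dir))
    random.shuffle(files)
    n_valid = int(len(files) * valid_frac)
    for d in (train_dir, valid_dir):
        os.makedirs(d, exist_ok=True)
    for f in files[:n_valid]:
        shutil.copy(os.path.join(src_dir, f), valid_dir)
    for f in files[n_valid:]:
        shutil.copy(os.path.join(src_dir, f), train_dir)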
!gdown https://drive.google.com/uc?id=1axU8tROnd8H1snOh_nZhtYPjHGICDS7T
Downloading...
From: https://drive.google.com/uc?id=1axU8tROnd8H1snOh_nZhtYPjHGICDS7T
To: /content/bears_overfit.zip
472MB [00:06, 69.4MB/s]
!ls -lh
total 900M
drwxr-xr-x 5 root root 4.0K Apr 15 13:07 bears
-rw-r--r-- 1 root root 450M Apr 15 17:43 bears_overfit.zip
-rw-r--r-- 1 root root 450M Apr 15 17:42 bears.zip
!unzip -q bears_overfit.zip
!ls -lh
total 900M
drwxr-xr-x 5 root root 4.0K Apr 15 13:07 bears
-rw-r--r-- 1 root root 450M Apr 15 17:43 bears_overfit.zip
-rw-r--r-- 1 root root 450M Apr 15 17:42 bears.zip
drwxr-xr-x 4 root root 4.0K Mar 25 16:01 overfit
from tensorflow.keras.preprocessing.image import ImageDataGenerator
all_data = ImageDataGenerator(rescale=1./255)
overfit_train = all_data.flow_from_directory(
    'overfit/training', target_size=(256, 256),
    batch_size=16, class_mode='categorical')
overfit_valid = all_data.flow_from_directory(
    'overfit/validation', target_size=(256, 256),
    batch_size=16, class_mode='categorical')
# For grayscale input, add: color_mode='grayscale'
Found 972 images belonging to 3 classes.
Found 241 images belonging to 3 classes.
print(overfit_train.image_shape)
print(overfit_train.n)
print(overfit_valid.image_shape)
print(overfit_valid.n)
(256, 256, 3)
972
(256, 256, 3)
241
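These counts explain the "61/61" in the training logs below: with a generator batch size of 16, Keras runs ceil(972 / 16) = 61 batches per epoch. A quick check:

import math
# 972 training images in batches of 16 -> 61 steps per epoch
print(math.ceil(overfit_train.n / overfit_train.batch_size))  # 61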
To force overfitting, we deliberately inflate the model's parameter count: the more parameters a model has, the more easily it can memorize the training data instead of learning features that generalize.
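As a reminder of where the parameter counts in the summaries below come from: a Conv2D layer has filters x (kernel_h x kernel_w x input_channels + 1) weights, and a Dense layer has units x (inputs + 1). A quick check against the first convolutional and first dense layer of the model below:

def conv2d_params(filters, kernel_size, in_channels):
    # One kernel_size x kernel_size window per input channel,
    # plus a bias term per filter.
    return filters * (kernel_size * kernel_size * in_channels + 1)

def dense_params(units, inputs):
    # One weight per input plus one bias, for each unit.
    return units * (inputs + 1)

print(conv2d_params(64, 5, 3))  # 4864, matching conv2d in the summary
print(dense_params(40, 36864))  # 1474600, matching dense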
Overfit Model 1:
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Conv2D, MaxPool2D, Flatten
from tensorflow.keras.optimizers import Adam, Adamax
overfit_model_1 = Sequential()
overfit_model_1.add( Conv2D(filters=64, kernel_size=5, activation = 'relu', input_shape = overfit_train.image_shape ) )
overfit_model_1.add( Conv2D(filters=32, kernel_size=5, activation = 'relu' ) )
overfit_model_1.add( Conv2D(filters=16, kernel_size=5, activation = 'relu' ) )
overfit_model_1.add( MaxPool2D(5,5))
overfit_model_1.add( Flatten())
overfit_model_1.add( Dense(units=40, activation = 'relu' ) )
overfit_model_1.add( Dense(units=20, activation = 'relu' ) )
overfit_model_1.add( Dense(units=10, activation = 'relu' ) )
overfit_model_1.add( Dense(units=3, activation = 'softmax' ) )
overfit_model_1.summary()
Model: "sequential" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= conv2d (Conv2D) (None, 252, 252, 64) 4864 _________________________________________________________________ conv2d_1 (Conv2D) (None, 248, 248, 32) 51232 _________________________________________________________________ conv2d_2 (Conv2D) (None, 244, 244, 16) 12816 _________________________________________________________________ max_pooling2d (MaxPooling2D) (None, 48, 48, 16) 0 _________________________________________________________________ flatten (Flatten) (None, 36864) 0 _________________________________________________________________ dense (Dense) (None, 40) 1474600 _________________________________________________________________ dense_1 (Dense) (None, 20) 820 _________________________________________________________________ dense_2 (Dense) (None, 10) 210 _________________________________________________________________ dense_3 (Dense) (None, 3) 33 ================================================================= Total params: 1,544,575 Trainable params: 1,544,575 Non-trainable params: 0 _________________________________________________________________
from tensorflow.keras.metrics import Precision, Recall
overfit_model_1.compile( optimizer = 'adam', loss = 'categorical_crossentropy', metrics = [ 'accuracy', Precision(), Recall()] )
# Note: batch_size here has no effect; batches come from the generator
# (16 per batch), which is why each epoch below runs 61 steps.
history1 = overfit_model_1.fit( overfit_train, validation_data = overfit_valid, epochs = 100, batch_size = 64 )
Epoch 1/100
61/61 [==============================] - 70s 585ms/step - loss: 1.0836 - accuracy: 0.4398 - precision: 0.5835 - recall: 0.0711 - val_loss: 0.8315 - val_accuracy: 0.6349 - val_precision: 0.7565 - val_recall: 0.3610
...
Epoch 11/100
61/61 [==============================] - 33s 540ms/step - loss: 9.5008e-04 - accuracy: 1.0000 - precision: 1.0000 - recall: 1.0000 - val_loss: 1.0723 - val_accuracy: 0.8174 - val_precision: 0.8201 - val_recall: 0.8133
...
Epoch 100/100
61/61 [==============================] - 32s 524ms/step - loss: 2.1256e-07 - accuracy: 1.0000 - precision: 1.0000 - recall: 1.0000 - val_loss: 1.7863 - val_accuracy: 0.7884 - val_precision: 0.7917 - val_recall: 0.7884
(log truncated: training accuracy reaches 1.0000 at epoch 11 and stays there, while validation accuracy plateaus near 0.79-0.82 and validation loss climbs steadily, the signature of overfitting)
Plot learning curves.
plt.subplots_adjust(right=2.1, left=.09)
plt.subplot(1,3,1)
plt.plot(history1.history['accuracy'])
plt.plot(history1.history['val_accuracy'])
plt.ylabel('Accuracy')
plt.xlabel('')
plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,2)
plt.plot(history1.history['precision'])
plt.plot(history1.history['val_precision'])
plt.ylabel('Precision')
plt.xlabel('Epoch')
#plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,3)
plt.plot(history1.history['recall'])
plt.plot(history1.history['val_recall'])
plt.ylabel('Recall')
plt.xlabel('')
#plt.legend(['training','validation'], loc="lower right")
plt.show()
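The same three-panel plot is rebuilt below for each model, so the repetition could be factored into a helper. A minimal sketch (plot_history is a name introduced here; the suffix argument handles Keras appending _1, _2, ... to metric names for later models):

def plot_history(history, suffix=''):
    """Plot accuracy, precision, and recall curves for one training run."""
    plt.subplots_adjust(right=2.1, left=.09)
    for i, metric in enumerate(['accuracy', 'precision', 'recall']):
        key = metric if metric == 'accuracy' else metric + suffix
        plt.subplot(1, 3, i + 1)
        plt.plot(history.history[key])
        plt.plot(history.history['val_' + key])
        plt.ylabel(metric.capitalize())
        plt.xlabel('Epoch')
        plt.legend(['training', 'validation'], loc='lower right')
    plt.show()

# e.g. plot_history(history1); plot_history(history2, suffix='_1')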
Overfit Model 2:
overfit_model_2 = Sequential()
overfit_model_2.add( Conv2D(filters=64, kernel_size=3, activation = 'relu', input_shape = overfit_train.image_shape ) )
for i in range(6):
    overfit_model_2.add( Conv2D(filters=32, kernel_size=3, activation = 'relu' ) )
overfit_model_2.add( Conv2D(filters=16, kernel_size=3, activation = 'relu' ) )
overfit_model_2.add( MaxPool2D(5,5))
overfit_model_2.add( Flatten())
overfit_model_2.add( Dense(units=80, activation = 'relu' ) )
overfit_model_2.add( Dense(units=40, activation = 'relu' ) )
overfit_model_2.add( Dense(units=20, activation = 'relu' ) )
overfit_model_2.add( Dense(units=10, activation = 'relu' ) )
overfit_model_2.add( Dense(units=3, activation = 'softmax' ) )
overfit_model_2.summary()
Model: "sequential_1" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= conv2d_3 (Conv2D) (None, 254, 254, 64) 1792 _________________________________________________________________ conv2d_4 (Conv2D) (None, 252, 252, 32) 18464 _________________________________________________________________ conv2d_5 (Conv2D) (None, 250, 250, 32) 9248 _________________________________________________________________ conv2d_6 (Conv2D) (None, 248, 248, 32) 9248 _________________________________________________________________ conv2d_7 (Conv2D) (None, 246, 246, 32) 9248 _________________________________________________________________ conv2d_8 (Conv2D) (None, 244, 244, 32) 9248 _________________________________________________________________ conv2d_9 (Conv2D) (None, 242, 242, 32) 9248 _________________________________________________________________ conv2d_10 (Conv2D) (None, 240, 240, 16) 4624 _________________________________________________________________ max_pooling2d_1 (MaxPooling2 (None, 48, 48, 16) 0 _________________________________________________________________ flatten_1 (Flatten) (None, 36864) 0 _________________________________________________________________ dense_4 (Dense) (None, 80) 2949200 _________________________________________________________________ dense_5 (Dense) (None, 40) 3240 _________________________________________________________________ dense_6 (Dense) (None, 20) 820 _________________________________________________________________ dense_7 (Dense) (None, 10) 210 _________________________________________________________________ dense_8 (Dense) (None, 3) 33 ================================================================= Total params: 3,024,623 Trainable params: 3,024,623 Non-trainable params: 0 _________________________________________________________________
overfit_model_2.compile( optimizer = 'adam', loss = 'categorical_crossentropy', metrics = [ 'accuracy', Precision(), Recall()] )
history2 = overfit_model_2.fit( overfit_train, validation_data = overfit_valid, epochs = 50, batch_size = 64 )
Epoch 1/50
61/61 [==============================] - 44s 644ms/step - loss: 1.0032 - accuracy: 0.4756 - precision_1: 0.7077 - recall_1: 0.1934 - val_loss: 0.6647 - val_accuracy: 0.7261 - val_precision_1: 0.8053 - val_recall_1: 0.6349
...
Epoch 24/50
61/61 [==============================] - 34s 564ms/step - loss: 6.7998e-04 - accuracy: 1.0000 - precision_1: 1.0000 - recall_1: 1.0000 - val_loss: 0.9113 - val_accuracy: 0.8506 - val_precision_1: 0.8506 - val_recall_1: 0.8506
...
Epoch 50/50
61/61 [==============================] - 33s 542ms/step - loss: 6.0117e-06 - accuracy: 1.0000 - precision_1: 1.0000 - recall_1: 1.0000 - val_loss: 1.3787 - val_accuracy: 0.8423 - val_precision_1: 0.8458 - val_recall_1: 0.8423
(log truncated: training accuracy reaches 1.0000 by epoch 24; validation accuracy plateaus around 0.84-0.85 while validation loss rises)
Plot learning curves.
plt.subplots_adjust(right=2.1, left=.09)
plt.subplot(1,3,1)
plt.plot(history2.history['accuracy'])
plt.plot(history2.history['val_accuracy'])
plt.ylabel('Accuracy')
plt.xlabel('')
plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,2)
plt.plot(history2.history['precision_1'])
plt.plot(history2.history['val_precision_1'])
plt.ylabel('Precision')
plt.xlabel('Epoch')
#plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,3)
plt.plot(history2.history['recall_1'])
plt.plot(history2.history['val_recall_1'])
plt.ylabel('Recall')
plt.xlabel('')
#plt.legend(['training','validation'], loc="lower right")
plt.show()
Overfit Model 3:
overfit_model_3 = Sequential()
overfit_model_3.add( Conv2D(filters=128, kernel_size=3, activation = 'relu', input_shape = overfit_train.image_shape ) )
overfit_model_3.add( Conv2D(filters=64, kernel_size=3, activation = 'relu' ) )
overfit_model_3.add( Conv2D(filters=32, kernel_size=3, activation = 'relu' ) )
overfit_model_3.add( MaxPool2D(5,5))
overfit_model_3.add( Flatten())
overfit_model_3.add( Dense(units=120, activation = 'relu' ) )
overfit_model_3.add( Dense(units=60, activation = 'relu' ) )
overfit_model_3.add( Dense(units=30, activation = 'relu' ) )
overfit_model_3.add( Dense(units=3, activation = 'softmax' ) )
overfit_model_3.summary()
Model: "sequential_2" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= conv2d_11 (Conv2D) (None, 254, 254, 128) 3584 _________________________________________________________________ conv2d_12 (Conv2D) (None, 252, 252, 64) 73792 _________________________________________________________________ conv2d_13 (Conv2D) (None, 250, 250, 32) 18464 _________________________________________________________________ max_pooling2d_2 (MaxPooling2 (None, 50, 50, 32) 0 _________________________________________________________________ flatten_2 (Flatten) (None, 80000) 0 _________________________________________________________________ dense_9 (Dense) (None, 120) 9600120 _________________________________________________________________ dense_10 (Dense) (None, 60) 7260 _________________________________________________________________ dense_11 (Dense) (None, 30) 1830 _________________________________________________________________ dense_12 (Dense) (None, 3) 93 ================================================================= Total params: 9,705,143 Trainable params: 9,705,143 Non-trainable params: 0 _________________________________________________________________
overfit_model_3.compile( optimizer = 'adam', loss = 'categorical_crossentropy', metrics = [ 'accuracy', Precision(), Recall()] )
history3 = overfit_model_3.fit( overfit_train, validation_data = overfit_valid, epochs = 50, batch_size = 64 )
Epoch 1/50
61/61 [==============================] - 40s 605ms/step - loss: 1.3261 - accuracy: 0.3675 - precision_2: 0.3346 - recall_2: 0.0561 - val_loss: 0.8507 - val_accuracy: 0.6805 - val_precision_2: 0.7143 - val_recall_2: 0.3320
...
Epoch 10/50
61/61 [==============================] - 33s 545ms/step - loss: 7.7096e-04 - accuracy: 1.0000 - precision_2: 1.0000 - recall_2: 1.0000 - val_loss: 0.9275 - val_accuracy: 0.8465 - val_precision_2: 0.8458 - val_recall_2: 0.8423
...
Epoch 50/50
61/61 [==============================] - 33s 538ms/step - loss: 2.4985e-07 - accuracy: 1.0000 - precision_2: 1.0000 - recall_2: 1.0000 - val_loss: 1.8440 - val_accuracy: 0.8382 - val_precision_2: 0.8382 - val_recall_2: 0.8382
(log truncated: training accuracy hits 1.0000 by epoch 10; validation accuracy settles around 0.83-0.85 as validation loss climbs)
Plot the learning curves for the overfitted model.
plt.subplots_adjust(right=2.1, left=.09)
plt.subplot(1,3,1)
plt.plot(history3.history['accuracy'])
plt.plot(history3.history['val_accuracy'])
plt.ylabel('Accuracy')
plt.xlabel('')
plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,2)
plt.plot(history3.history['precision_2'])
plt.plot(history3.history['val_precision_2'])
plt.ylabel('Precision')
plt.xlabel('Epoch')
#plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,3)
plt.plot(history3.history['recall_2'])
plt.plot(history3.history['val_recall_2'])
plt.ylabel('Recall')
plt.xlabel('')
#plt.legend(['training','validation'], loc="lower right")
plt.show()
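The same three-panel plotting block recurs for every model below; the originals are kept as written, but a small helper shows how it could be factored out (a sketch, not used in the runs here; the precision_key/recall_key parameters are our own workaround for Keras auto-numbering metric names across runs):
import matplotlib.pyplot as plt

def plot_history(history, precision_key='precision', recall_key='recall'):
    # Keras numbers metric instances per session (e.g. 'precision_2'),
    # so the exact history keys must be passed in for each run.
    plt.subplots_adjust(right=1.95, left=.03)
    panels = [('accuracy', 'Accuracy'), (precision_key, 'Precision'), (recall_key, 'Recall')]
    for i, (key, label) in enumerate(panels):
        plt.subplot(1, 3, i + 1)
        plt.plot(history.history[key])
        plt.plot(history.history['val_' + key])
        plt.ylabel(label)
        plt.xlabel('Epoch')
    plt.legend(['training', 'validation'], loc='lower right')
    plt.show()

# usage, e.g.: plot_history(history3, 'precision_2', 'recall_2')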
Now that we have confirmed the model can be made to overfit, we will focus on building a model that trains well and generalizes.
validation_data = image_generator.flow_from_directory( 'bears/validation', target_size=(256, 256), batch_size=9, class_mode='categorical')
testing_data = image_generator.flow_from_directory( 'bears/test', target_size=(256, 256), batch_size=9, class_mode='categorical')
Found 243 images belonging to 3 classes. Found 243 images belonging to 3 classes.
from tensorflow.keras.callbacks import EarlyStopping
callback_earlystp = EarlyStopping(monitor='val_loss', mode='min', patience=15, verbose=1)
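One option we did not use in the runs below (so treat it as an aside, not part of the results): restore_best_weights=True would roll the model back to the weights from the epoch with the lowest val_loss whenever training stops early.
# Variant (not used in this notebook): revert to the best-val_loss weights on stop.
callback_earlystp_best = EarlyStopping(monitor='val_loss', mode='min',
                                       patience=15, verbose=1,
                                       restore_best_weights=True)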
After multiple experiments, the following architecture proved to be the most effective:
model_1 = Sequential()
model_1.add( Conv2D(filters=16, kernel_size=3, activation = 'relu', input_shape = training_data.image_shape ) )
model_1.add( MaxPool2D(5,5))
model_1.add( Conv2D(filters=8, kernel_size=3, activation = 'relu' ) )
model_1.add( MaxPool2D(5,5))
model_1.add( Conv2D(filters=4, kernel_size=3, activation = 'relu' ) )
model_1.add( MaxPool2D(5,5))
model_1.add( Flatten())
model_1.add( Dense(units=5, activation = 'relu' ) )
model_1.add( Dense(units=3, activation = 'softmax' ) )
model_1.summary()
Model: "sequential_3" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= conv2d_9 (Conv2D) (None, 254, 254, 16) 448 _________________________________________________________________ max_pooling2d_9 (MaxPooling2 (None, 50, 50, 16) 0 _________________________________________________________________ conv2d_10 (Conv2D) (None, 48, 48, 8) 1160 _________________________________________________________________ max_pooling2d_10 (MaxPooling (None, 9, 9, 8) 0 _________________________________________________________________ conv2d_11 (Conv2D) (None, 7, 7, 4) 292 _________________________________________________________________ max_pooling2d_11 (MaxPooling (None, 1, 1, 4) 0 _________________________________________________________________ flatten_3 (Flatten) (None, 4) 0 _________________________________________________________________ dense_7 (Dense) (None, 5) 25 _________________________________________________________________ dense_8 (Dense) (None, 3) 18 ================================================================= Total params: 1,943 Trainable params: 1,943 Non-trainable params: 0 _________________________________________________________________
from tensorflow.keras.metrics import Precision, Recall

model_1.compile( optimizer = 'adam', loss = 'categorical_crossentropy', metrics = [ 'accuracy', Precision(), Recall()] )
# The generators already batch the data (batch_size=9 set in flow_from_directory),
# so model.fit takes no batch_size argument here.
fit_history_1 = model_1.fit( training_data, validation_data = validation_data, epochs = 50, callbacks = [callback_earlystp] )
Epoch 1/50
81/81 [==============================] - 26s 312ms/step - loss: 1.0914 - accuracy: 0.4185 - precision_4: 0.0000e+00 - recall_4: 0.0000e+00 - val_loss: 1.0588 - val_accuracy: 0.3539 - val_precision_4: 0.0000e+00 - val_recall_4: 0.0000e+00
[epochs 2-24 omitted: val_accuracy rises from 0.34 to 0.82]
Epoch 25/50
81/81 [==============================] - 25s 305ms/step - loss: 0.3768 - accuracy: 0.8491 - precision_4: 0.8526 - recall_4: 0.8377 - val_loss: 0.3873 - val_accuracy: 0.8642 - val_precision_4: 0.8991 - val_recall_4: 0.8436
[epochs 26-49 omitted: val_accuracy fluctuates between 0.74 and 0.91]
Epoch 50/50
81/81 [==============================] - 26s 327ms/step - loss: 0.1313 - accuracy: 0.9553 - precision_4: 0.9602 - recall_4: 0.9513 - val_loss: 0.2648 - val_accuracy: 0.8971 - val_precision_4: 0.9046 - val_recall_4: 0.8971
plt.subplots_adjust(right=1.95, left=.03)
plt.subplot(1,3,1)
plt.plot(fit_history_1.history['accuracy'])
plt.plot(fit_history_1.history['val_accuracy'])
plt.ylabel('Accuracy')
plt.xlabel('')
plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,2)
plt.plot(fit_history_1.history['precision_4'])
plt.plot(fit_history_1.history['val_precision_4'])
plt.ylabel('Precision')
plt.xlabel('Epoch')
#plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,3)
plt.plot(fit_history_1.history['recall_4'])
plt.plot(fit_history_1.history['val_recall_4'])
plt.ylabel('Recall')
plt.xlabel('')
#plt.legend(['training','validation'], loc="lower right")
plt.show()
test_loss, test_acc, test_precision, test_recall = model_1.evaluate(testing_data)
print('%s %.2f' % ('test_acc: ', test_acc*100.0 ))
print('%s %.2f' % ('test_loss:', test_loss ))
print('%s %.2f' % ('test_precision:', test_precision ))
print('%s %.2f' % ('test_recall:', test_recall ))
27/27 [==============================] - 6s 218ms/step - loss: 0.3333 - accuracy: 0.8971 - precision_4: 0.9038 - recall_4: 0.8889 test_acc: 89.71 test_loss: 0.33 test_precision: 0.90 test_recall: 0.89
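To see which species the model actually confuses, a confusion matrix helps. A sketch, assuming scikit-learn is available; the test generator must be rebuilt with shuffle=False so the predictions line up with testing_data.classes:
import numpy as np
from sklearn.metrics import confusion_matrix

testing_data = image_generator.flow_from_directory(
    'bears/test', target_size=(256, 256), batch_size=9,
    class_mode='categorical', shuffle=False)  # keep order fixed

preds = np.argmax(model_1.predict(testing_data), axis=1)
print(testing_data.class_indices)                     # label -> column index
print(confusion_matrix(testing_data.classes, preds))  # rows: true, cols: predicted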
We now remove the hidden dense layer to see how a slightly shallower model performs.
model_2 = Sequential()
model_2.add( Conv2D(filters=16, kernel_size=3, activation = 'relu', input_shape = training_data.image_shape ) )
model_2.add( MaxPool2D(5,5))
model_2.add( Conv2D(filters=8, kernel_size=3, activation = 'relu' ) )
model_2.add( MaxPool2D(5,5))
model_2.add( Conv2D(filters=4, kernel_size=3, activation = 'relu' ) )
model_2.add( MaxPool2D(5,5))
model_2.add( Flatten())
model_2.add( Dense(units=3, activation = 'softmax' ) )
model_2.summary()
Model: "sequential_4" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= conv2d_17 (Conv2D) (None, 254, 254, 16) 448 _________________________________________________________________ max_pooling2d_6 (MaxPooling2 (None, 50, 50, 16) 0 _________________________________________________________________ conv2d_18 (Conv2D) (None, 48, 48, 8) 1160 _________________________________________________________________ max_pooling2d_7 (MaxPooling2 (None, 9, 9, 8) 0 _________________________________________________________________ conv2d_19 (Conv2D) (None, 7, 7, 4) 292 _________________________________________________________________ max_pooling2d_8 (MaxPooling2 (None, 1, 1, 4) 0 _________________________________________________________________ flatten_4 (Flatten) (None, 4) 0 _________________________________________________________________ dense_15 (Dense) (None, 3) 15 ================================================================= Total params: 1,915 Trainable params: 1,915 Non-trainable params: 0 _________________________________________________________________
model_2.compile( optimizer = 'adam', loss = 'categorical_crossentropy', metrics = [ 'accuracy', Precision(), Recall()] )
fit_history_2 = model_2.fit( training_data, validation_data = validation_data, epochs = 50, callbacks = [callback_earlystp] )
Epoch 1/50
81/81 [==============================] - 26s 316ms/step - loss: 1.1370 - accuracy: 0.3504 - precision_4: 0.3345 - recall_4: 0.0203 - val_loss: 1.0676 - val_accuracy: 0.3621 - val_precision_4: 0.0000e+00 - val_recall_4: 0.0000e+00
[epochs 2-24 omitted: val_accuracy rises from 0.53 into the low 0.80s]
Epoch 25/50
81/81 [==============================] - 25s 308ms/step - loss: 0.4052 - accuracy: 0.8440 - precision_4: 0.8544 - recall_4: 0.8367 - val_loss: 0.4421 - val_accuracy: 0.8436 - val_precision_4: 0.8565 - val_recall_4: 0.8354
[epochs 26-49 omitted: val_accuracy plateaus between 0.81 and 0.89]
Epoch 50/50
81/81 [==============================] - 25s 309ms/step - loss: 0.2414 - accuracy: 0.9123 - precision_4: 0.9205 - recall_4: 0.9072 - val_loss: 0.4572 - val_accuracy: 0.8107 - val_precision_4: 0.8186 - val_recall_4: 0.7984
plt.subplots_adjust(right=1.95, left=.03)
plt.subplot(1,3,1)
plt.plot(fit_history_2.history['accuracy'])
plt.plot(fit_history_2.history['val_accuracy'])
plt.ylabel('Accuracy')
plt.xlabel('')
plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,2)
plt.plot(fit_history_2.history['precision_4'])
plt.plot(fit_history_2.history['val_precision_4'])
plt.ylabel('Precision')
plt.xlabel('Epoch')
#plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,3)
plt.plot(fit_history_2.history['recall_4'])
plt.plot(fit_history_2.history['val_recall_4'])
plt.ylabel('Recall')
plt.xlabel('')
#plt.legend(['training','validation'], loc="lower right")
plt.show()
test_loss, test_acc, test_precision, test_recall = model_2.evaluate(testing_data)
print('%s %.2f' % ('test_acc: ', test_acc*100.0 ))
print('%s %.2f' % ('test_loss:', test_loss ))
print('%s %.2f' % ('test_precision:', test_precision ))
print('%s %.2f' % ('test_recall:', test_recall ))
27/27 [==============================] - 6s 208ms/step - loss: 0.4525 - accuracy: 0.8519 - precision_4: 0.8638 - recall_4: 0.8354 test_acc: 85.19 test_loss: 0.45 test_precision: 0.86 test_recall: 0.84
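As an end-to-end check (a sketch; 'my_bear.jpg' is a placeholder path, not a file from the dataset), a single image can be pushed through any of these models:
import numpy as np
from tensorflow.keras.preprocessing import image

img = image.load_img('my_bear.jpg', target_size=(256, 256))  # placeholder path
x = np.expand_dims(image.img_to_array(img) / 255.0, axis=0)  # same rescaling as the generators, plus a batch dimension
probs = model_2.predict(x)[0]
print(dict(zip(training_data.class_indices, probs)))  # class name -> predicted probability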
We will now add hidden dense layers and see how performance changes.
model_3 = Sequential()
model_3.add( Conv2D(filters=16, kernel_size=3, activation = 'relu', input_shape = training_data.image_shape ) )
model_3.add( MaxPool2D(5,5))
model_3.add( Conv2D(filters=8, kernel_size=3, activation = 'relu' ) )
model_3.add( MaxPool2D(5,5))
model_3.add( Conv2D(filters=4, kernel_size=3, activation = 'relu' ) )
model_3.add( MaxPool2D(5,5))
model_3.add( Flatten())
model_3.add( Dense(units=10, activation="relu"))
model_3.add( Dense(units=5, activation="relu"))
model_3.add( Dense(units=3, activation = 'softmax' ) )
model_3.summary()
Model: "sequential" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= conv2d (Conv2D) (None, 254, 254, 16) 448 _________________________________________________________________ max_pooling2d (MaxPooling2D) (None, 50, 50, 16) 0 _________________________________________________________________ conv2d_1 (Conv2D) (None, 48, 48, 8) 1160 _________________________________________________________________ max_pooling2d_1 (MaxPooling2 (None, 9, 9, 8) 0 _________________________________________________________________ conv2d_2 (Conv2D) (None, 7, 7, 4) 292 _________________________________________________________________ max_pooling2d_2 (MaxPooling2 (None, 1, 1, 4) 0 _________________________________________________________________ flatten (Flatten) (None, 4) 0 _________________________________________________________________ dense (Dense) (None, 10) 50 _________________________________________________________________ dense_1 (Dense) (None, 5) 55 _________________________________________________________________ dense_2 (Dense) (None, 3) 18 ================================================================= Total params: 2,023 Trainable params: 2,023 Non-trainable params: 0 _________________________________________________________________
model_3.compile( optimizer = 'adam', loss = 'categorical_crossentropy', metrics = [ 'accuracy', Precision(), Recall()] )
fit_history_3 = model_3.fit( training_data, validation_data = validation_data, epochs = 50, callbacks = [callback_earlystp] )
Epoch 1/50
81/81 [==============================] - 58s 317ms/step - loss: 1.0971 - accuracy: 0.3357 - precision: 0.0000e+00 - recall: 0.0000e+00 - val_loss: 1.0948 - val_accuracy: 0.3909 - val_precision: 0.0000e+00 - val_recall: 0.0000e+00
[epochs 2-30 omitted: val_accuracy climbs from 0.45 to the high 0.80s]
Epoch 31/50
81/81 [==============================] - 25s 309ms/step - loss: 0.1323 - accuracy: 0.9585 - precision: 0.9606 - recall: 0.9543 - val_loss: 0.3134 - val_accuracy: 0.8889 - val_precision: 0.8884 - val_recall: 0.8848
[epochs 32-45 omitted: val_loss never improves on 0.3134]
Epoch 46/50
81/81 [==============================] - 25s 308ms/step - loss: 0.0522 - accuracy: 0.9888 - precision: 0.9892 - recall: 0.9883 - val_loss: 0.3580 - val_accuracy: 0.8971 - val_precision: 0.8971 - val_recall: 0.8971
Epoch 00046: early stopping
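Early stopping triggered here because val_loss last improved at epoch 31 (0.3134) and then went 15 epochs (the patience window) without beating it. The history object confirms this:
import numpy as np

best = int(np.argmin(fit_history_3.history['val_loss']))
print(best + 1, fit_history_3.history['val_loss'][best])  # -> 31 0.3134...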
plt.subplots_adjust(right=1.95, left=.03)
plt.subplot(1,3,1)
plt.plot(fit_history_3.history['accuracy'])
plt.plot(fit_history_3.history['val_accuracy'])
plt.ylabel('Accuracy')
plt.xlabel('')
plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,2)
plt.plot(fit_history_3.history['precision'])
plt.plot(fit_history_3.history['val_precision'])
plt.ylabel('Precision')
plt.xlabel('Epoch')
#plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,3)
plt.plot(fit_history_3.history['recall'])
plt.plot(fit_history_3.history['val_recall'])
plt.ylabel('Recall')
plt.xlabel('')
#plt.legend(['training','validation'], loc="lower right")
plt.show()
test_loss, test_acc, test_precision, test_recall = model_3.evaluate(testing_data)
print('%s %.2f' % ('test_acc: ', test_acc*100.0 ))
print('%s %.2f' % ('test_loss:', test_loss ))
print('%s %.2f' % ('test_precision:', test_precision ))
print('%s %.2f' % ('test_recall:', test_recall ))
27/27 [==============================] - 6s 214ms/step - loss: 0.4167 - accuracy: 0.8971 - precision: 0.9008 - recall: 0.8971 test_acc: 89.71 test_loss: 0.42 test_precision: 0.90 test_recall: 0.90
So far, model_1 and model_3 both reach 89.71% test accuracy, while the shallower model_2 trails at 85.19%. We will now decrease the number of filters at each level and evaluate performance.
model_4 = Sequential()
model_4.add( Conv2D(filters=8, kernel_size=3, activation = 'relu', input_shape = training_data.image_shape ) )
model_4.add( MaxPool2D(5,5))
model_4.add( Conv2D(filters=4, kernel_size=3, activation = 'relu' ) )
model_4.add( MaxPool2D(5,5))
model_4.add( Conv2D(filters=2, kernel_size=3, activation = 'relu' ) )
model_4.add( MaxPool2D(5,5))
model_4.add( Flatten())
model_4.add( Dense(units=3, activation = 'relu' ) )
model_4.add( Dense(units=3, activation = 'softmax' ) )
model_4.summary()
Model: "sequential_1" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= conv2d_3 (Conv2D) (None, 254, 254, 8) 224 _________________________________________________________________ max_pooling2d_3 (MaxPooling2 (None, 50, 50, 8) 0 _________________________________________________________________ conv2d_4 (Conv2D) (None, 48, 48, 4) 292 _________________________________________________________________ max_pooling2d_4 (MaxPooling2 (None, 9, 9, 4) 0 _________________________________________________________________ conv2d_5 (Conv2D) (None, 7, 7, 2) 74 _________________________________________________________________ max_pooling2d_5 (MaxPooling2 (None, 1, 1, 2) 0 _________________________________________________________________ flatten_1 (Flatten) (None, 2) 0 _________________________________________________________________ dense_3 (Dense) (None, 3) 9 _________________________________________________________________ dense_4 (Dense) (None, 3) 12 ================================================================= Total params: 611 Trainable params: 611 Non-trainable params: 0 _________________________________________________________________
model_4.compile( optimizer = 'adam', loss = 'categorical_crossentropy', metrics = [ 'accuracy', Precision(), Recall()] )
fit_history_4 = model_4.fit( training_data, validation_data = validation_data, epochs = 50, callbacks = [callback_earlystp] )
Epoch 1/50
81/81 [==============================] - 27s 320ms/step - loss: 1.1003 - accuracy: 0.2934 - precision_1: 0.0000e+00 - recall_1: 0.0000e+00 - val_loss: 1.0982 - val_accuracy: 0.3086 - val_precision_1: 0.0000e+00 - val_recall_1: 0.0000e+00
[epochs 2-24 omitted: val_accuracy creeps from 0.31 to the mid 0.50s while val_recall stays near 0.25]
Epoch 25/50
81/81 [==============================] - 25s 308ms/step - loss: 0.7448 - accuracy: 0.6089 - precision_1: 0.8599 - recall_1: 0.2972 - val_loss: 0.8220 - val_accuracy: 0.5638 - val_precision_1: 0.7763 - val_recall_1: 0.2428
[epochs 26-49 omitted: no meaningful improvement; val_accuracy stays between 0.52 and 0.60]
Epoch 50/50
81/81 [==============================] - 25s 307ms/step - loss: 0.6680 - accuracy: 0.6279 - precision_1: 0.9232 - recall_1: 0.3189 - val_loss: 0.7924 - val_accuracy: 0.5720 - val_precision_1: 0.7561 - val_recall_1: 0.2551
plt.subplots_adjust(right=1.95, left=.03)
plt.subplot(1,3,1)
plt.plot(fit_history_4.history['accuracy'])
plt.plot(fit_history_4.history['val_accuracy'])
plt.ylabel('Accuracy')
plt.xlabel('')
plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,2)
plt.plot(fit_history_4.history['precision_1'])
plt.plot(fit_history_4.history['val_precision_1'])
plt.ylabel('Precision')
plt.xlabel('Epoch')
#plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,3)
plt.plot(fit_history_4.history['recall_1'])
plt.plot(fit_history_4.history['val_recall_1'])
plt.ylabel('Recall')
plt.xlabel('')
#plt.legend(['training','validation'], loc="lower right")
plt.show()
test_loss, test_acc, test_precision, test_recall = model_4.evaluate(testing_data)
print('%s %.2f' % ('test_acc: ', test_acc*100.0 ))
print('%s %.2f' % ('test_loss:', test_loss ))
print('%s %.2f' % ('test_precision:', test_precision ))
print('%s %.2f' % ('test_recall:', test_recall ))
27/27 [==============================] - 6s 209ms/step - loss: 0.7772 - accuracy: 0.6008 - precision_1: 0.8256 - recall_1: 0.2922 test_acc: 60.08 test_loss: 0.78 test_precision: 0.83 test_recall: 0.29
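A note on reading the precision and recall columns here and in the runs below: tf.keras.metrics.Precision() and Recall() apply a default decision threshold of 0.5 to each softmax output, so a prediction only counts as positive when that class's probability exceeds 0.5. That is why recall (about 0.29) can sit far below accuracy (about 0.60): the model is often correct by argmax even when the winning probability is 0.5 or less. For argmax-based, per-class numbers, a sketch along the following lines could be used; it assumes the test generator was created with shuffle=False so the label order matches the prediction order, and uses scikit-learn for convenience.
import numpy as np
from sklearn.metrics import classification_report
# Argmax-based per-class precision/recall for model_4 on the test set.
# Assumes testing_data was built with shuffle=False so that
# testing_data.classes lines up with the order of the predictions.
y_prob = model_4.predict(testing_data)
y_pred = np.argmax(y_prob, axis=1)
y_true = testing_data.classes
print(classification_report(y_true, y_pred, target_names=list(testing_data.class_indices)))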
We will now increase the number of filters at each convolutional level and evaluate performance.
model_5 = Sequential()
model_5.add( Conv2D(filters=32, kernel_size=3, activation = 'relu', input_shape = training_data.image_shape ) )
model_5.add( MaxPool2D(5,5))
model_5.add( Conv2D(filters=16, kernel_size=3, activation = 'relu' ) )
model_5.add( MaxPool2D(5,5))
model_5.add( Conv2D(filters=8, kernel_size=3, activation = 'relu' ) )
model_5.add( MaxPool2D(5,5))
model_5.add( Flatten())
model_5.add( Dense(units=10, activation = 'relu' ) )
model_5.add( Dense(units=3, activation = 'softmax' ) )
model_5.summary()
Model: "sequential_2" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= conv2d_6 (Conv2D) (None, 254, 254, 32) 896 _________________________________________________________________ max_pooling2d_6 (MaxPooling2 (None, 50, 50, 32) 0 _________________________________________________________________ conv2d_7 (Conv2D) (None, 48, 48, 16) 4624 _________________________________________________________________ max_pooling2d_7 (MaxPooling2 (None, 9, 9, 16) 0 _________________________________________________________________ conv2d_8 (Conv2D) (None, 7, 7, 8) 1160 _________________________________________________________________ max_pooling2d_8 (MaxPooling2 (None, 1, 1, 8) 0 _________________________________________________________________ flatten_2 (Flatten) (None, 8) 0 _________________________________________________________________ dense_5 (Dense) (None, 10) 90 _________________________________________________________________ dense_6 (Dense) (None, 3) 33 ================================================================= Total params: 6,803 Trainable params: 6,803 Non-trainable params: 0 _________________________________________________________________
model_5.compile( optimizer = 'adam', loss = 'categorical_crossentropy', metrics = [ 'accuracy', Precision(), Recall()] )
fit_history_5 = model_5.fit( training_data, validation_data = validation_data, epochs = 50, batch_size = 64, callbacks = [callback_earlystp] )
Epoch 1/50 81/81 [==============================] - 26s 309ms/step - loss: 1.1063 - accuracy: 0.3446 - precision_2: 0.0000e+00 - recall_2: 0.0000e+00 - val_loss: 1.0898 - val_accuracy: 0.3416 - val_precision_2: 0.0000e+00 - val_recall_2: 0.0000e+00 Epoch 2/50 81/81 [==============================] - 25s 304ms/step - loss: 1.0746 - accuracy: 0.4440 - precision_2: 0.3915 - recall_2: 0.0053 - val_loss: 0.9879 - val_accuracy: 0.5761 - val_precision_2: 0.7692 - val_recall_2: 0.0412 Epoch 3/50 81/81 [==============================] - 25s 304ms/step - loss: 0.8977 - accuracy: 0.6204 - precision_2: 0.7050 - recall_2: 0.2834 - val_loss: 0.6949 - val_accuracy: 0.7325 - val_precision_2: 0.7911 - val_recall_2: 0.5144 Epoch 4/50 81/81 [==============================] - 25s 306ms/step - loss: 0.6754 - accuracy: 0.6988 - precision_2: 0.7548 - recall_2: 0.5820 - val_loss: 0.6522 - val_accuracy: 0.6749 - val_precision_2: 0.7122 - val_recall_2: 0.6008 Epoch 5/50 81/81 [==============================] - 25s 302ms/step - loss: 0.5486 - accuracy: 0.7703 - precision_2: 0.7837 - recall_2: 0.7142 - val_loss: 0.7090 - val_accuracy: 0.6502 - val_precision_2: 0.6730 - val_recall_2: 0.5844 Epoch 6/50 81/81 [==============================] - 25s 305ms/step - loss: 0.5643 - accuracy: 0.7884 - precision_2: 0.7975 - recall_2: 0.7366 - val_loss: 0.4821 - val_accuracy: 0.8683 - val_precision_2: 0.9128 - val_recall_2: 0.8189 Epoch 7/50 81/81 [==============================] - 25s 306ms/step - loss: 0.4208 - accuracy: 0.8466 - precision_2: 0.8687 - recall_2: 0.8172 - val_loss: 0.5442 - val_accuracy: 0.8066 - val_precision_2: 0.8190 - val_recall_2: 0.7819 Epoch 8/50 81/81 [==============================] - 25s 304ms/step - loss: 0.3703 - accuracy: 0.8776 - precision_2: 0.8997 - recall_2: 0.8481 - val_loss: 0.5533 - val_accuracy: 0.7819 - val_precision_2: 0.8089 - val_recall_2: 0.7490 Epoch 9/50 81/81 [==============================] - 25s 304ms/step - loss: 0.3754 - accuracy: 0.8580 - precision_2: 0.8705 - recall_2: 0.8300 - val_loss: 0.3888 - val_accuracy: 0.8724 - val_precision_2: 0.8936 - val_recall_2: 0.8642 Epoch 10/50 81/81 [==============================] - 25s 303ms/step - loss: 0.2725 - accuracy: 0.9031 - precision_2: 0.9099 - recall_2: 0.9013 - val_loss: 0.3900 - val_accuracy: 0.8642 - val_precision_2: 0.8809 - val_recall_2: 0.8519 Epoch 11/50 81/81 [==============================] - 25s 306ms/step - loss: 0.3291 - accuracy: 0.8754 - precision_2: 0.8884 - recall_2: 0.8622 - val_loss: 0.3540 - val_accuracy: 0.8971 - val_precision_2: 0.9046 - val_recall_2: 0.8971 Epoch 12/50 81/81 [==============================] - 24s 304ms/step - loss: 0.2351 - accuracy: 0.9190 - precision_2: 0.9256 - recall_2: 0.9108 - val_loss: 0.3486 - val_accuracy: 0.8848 - val_precision_2: 0.8979 - val_recall_2: 0.8683 Epoch 13/50 81/81 [==============================] - 25s 307ms/step - loss: 0.2423 - accuracy: 0.9202 - precision_2: 0.9217 - recall_2: 0.9103 - val_loss: 0.3901 - val_accuracy: 0.8807 - val_precision_2: 0.8807 - val_recall_2: 0.8807 Epoch 14/50 81/81 [==============================] - 25s 302ms/step - loss: 0.3201 - accuracy: 0.8918 - precision_2: 0.8949 - recall_2: 0.8821 - val_loss: 0.3139 - val_accuracy: 0.9012 - val_precision_2: 0.9072 - val_recall_2: 0.8848 Epoch 15/50 81/81 [==============================] - 24s 303ms/step - loss: 0.1935 - accuracy: 0.9367 - precision_2: 0.9447 - recall_2: 0.9303 - val_loss: 0.2886 - val_accuracy: 0.9053 - val_precision_2: 0.9125 - val_recall_2: 0.9012 Epoch 16/50 81/81 
[==============================] - 25s 306ms/step - loss: 0.2098 - accuracy: 0.9356 - precision_2: 0.9475 - recall_2: 0.9217 - val_loss: 0.2909 - val_accuracy: 0.9053 - val_precision_2: 0.9046 - val_recall_2: 0.8971 Epoch 17/50 81/81 [==============================] - 25s 306ms/step - loss: 0.2025 - accuracy: 0.9213 - precision_2: 0.9280 - recall_2: 0.9161 - val_loss: 0.2966 - val_accuracy: 0.8971 - val_precision_2: 0.9068 - val_recall_2: 0.8807 Epoch 18/50 81/81 [==============================] - 25s 306ms/step - loss: 0.1772 - accuracy: 0.9406 - precision_2: 0.9437 - recall_2: 0.9286 - val_loss: 0.3075 - val_accuracy: 0.8889 - val_precision_2: 0.8963 - val_recall_2: 0.8889 Epoch 19/50 81/81 [==============================] - 25s 303ms/step - loss: 0.2300 - accuracy: 0.9253 - precision_2: 0.9355 - recall_2: 0.9108 - val_loss: 0.3378 - val_accuracy: 0.8889 - val_precision_2: 0.8958 - val_recall_2: 0.8848 Epoch 20/50 81/81 [==============================] - 25s 305ms/step - loss: 0.1703 - accuracy: 0.9373 - precision_2: 0.9443 - recall_2: 0.9293 - val_loss: 0.4115 - val_accuracy: 0.8560 - val_precision_2: 0.8625 - val_recall_2: 0.8519 Epoch 21/50 81/81 [==============================] - 24s 304ms/step - loss: 0.1554 - accuracy: 0.9486 - precision_2: 0.9561 - recall_2: 0.9310 - val_loss: 0.3004 - val_accuracy: 0.8971 - val_precision_2: 0.8967 - val_recall_2: 0.8930 Epoch 22/50 81/81 [==============================] - 24s 303ms/step - loss: 0.1574 - accuracy: 0.9483 - precision_2: 0.9556 - recall_2: 0.9401 - val_loss: 0.2803 - val_accuracy: 0.9095 - val_precision_2: 0.9129 - val_recall_2: 0.9053 Epoch 23/50 81/81 [==============================] - 25s 306ms/step - loss: 0.1682 - accuracy: 0.9428 - precision_2: 0.9447 - recall_2: 0.9308 - val_loss: 0.3177 - val_accuracy: 0.8848 - val_precision_2: 0.8848 - val_recall_2: 0.8848 Epoch 24/50 81/81 [==============================] - 24s 298ms/step - loss: 0.1386 - accuracy: 0.9432 - precision_2: 0.9551 - recall_2: 0.9420 - val_loss: 0.3029 - val_accuracy: 0.9177 - val_precision_2: 0.9174 - val_recall_2: 0.9136 Epoch 25/50 81/81 [==============================] - 25s 304ms/step - loss: 0.1085 - accuracy: 0.9623 - precision_2: 0.9673 - recall_2: 0.9600 - val_loss: 0.2876 - val_accuracy: 0.9012 - val_precision_2: 0.9083 - val_recall_2: 0.8971 Epoch 26/50 81/81 [==============================] - 25s 305ms/step - loss: 0.0802 - accuracy: 0.9849 - precision_2: 0.9874 - recall_2: 0.9812 - val_loss: 0.3460 - val_accuracy: 0.8765 - val_precision_2: 0.8833 - val_recall_2: 0.8724 Epoch 27/50 81/81 [==============================] - 24s 302ms/step - loss: 0.1254 - accuracy: 0.9573 - precision_2: 0.9716 - recall_2: 0.9541 - val_loss: 0.2709 - val_accuracy: 0.9300 - val_precision_2: 0.9295 - val_recall_2: 0.9218 Epoch 28/50 81/81 [==============================] - 24s 304ms/step - loss: 0.0947 - accuracy: 0.9659 - precision_2: 0.9768 - recall_2: 0.9659 - val_loss: 0.2551 - val_accuracy: 0.9342 - val_precision_2: 0.9336 - val_recall_2: 0.9259 Epoch 29/50 81/81 [==============================] - 25s 304ms/step - loss: 0.1071 - accuracy: 0.9592 - precision_2: 0.9632 - recall_2: 0.9547 - val_loss: 0.2905 - val_accuracy: 0.9095 - val_precision_2: 0.9125 - val_recall_2: 0.9012 Epoch 30/50 81/81 [==============================] - 25s 303ms/step - loss: 0.0923 - accuracy: 0.9722 - precision_2: 0.9748 - recall_2: 0.9650 - val_loss: 0.2808 - val_accuracy: 0.9259 - val_precision_2: 0.9298 - val_recall_2: 0.9259 Epoch 31/50 81/81 [==============================] - 24s 
300ms/step - loss: 0.0655 - accuracy: 0.9789 - precision_2: 0.9836 - recall_2: 0.9778 - val_loss: 0.3210 - val_accuracy: 0.8971 - val_precision_2: 0.9008 - val_recall_2: 0.8971 Epoch 32/50 81/81 [==============================] - 25s 305ms/step - loss: 0.0828 - accuracy: 0.9807 - precision_2: 0.9824 - recall_2: 0.9768 - val_loss: 0.2872 - val_accuracy: 0.9177 - val_precision_2: 0.9289 - val_recall_2: 0.9136 Epoch 33/50 81/81 [==============================] - 25s 305ms/step - loss: 0.0616 - accuracy: 0.9823 - precision_2: 0.9862 - recall_2: 0.9822 - val_loss: 0.3207 - val_accuracy: 0.9095 - val_precision_2: 0.9091 - val_recall_2: 0.9053 Epoch 34/50 81/81 [==============================] - 24s 304ms/step - loss: 0.0782 - accuracy: 0.9695 - precision_2: 0.9709 - recall_2: 0.9695 - val_loss: 0.4028 - val_accuracy: 0.8848 - val_precision_2: 0.8884 - val_recall_2: 0.8848 Epoch 35/50 81/81 [==============================] - 24s 298ms/step - loss: 0.1452 - accuracy: 0.9431 - precision_2: 0.9472 - recall_2: 0.9416 - val_loss: 0.4573 - val_accuracy: 0.8724 - val_precision_2: 0.8755 - val_recall_2: 0.8683 Epoch 36/50 81/81 [==============================] - 25s 306ms/step - loss: 0.0910 - accuracy: 0.9650 - precision_2: 0.9714 - recall_2: 0.9650 - val_loss: 0.3895 - val_accuracy: 0.8930 - val_precision_2: 0.8967 - val_recall_2: 0.8930 Epoch 37/50 81/81 [==============================] - 25s 304ms/step - loss: 0.0756 - accuracy: 0.9672 - precision_2: 0.9672 - recall_2: 0.9672 - val_loss: 0.2953 - val_accuracy: 0.9259 - val_precision_2: 0.9253 - val_recall_2: 0.9177 Epoch 38/50 81/81 [==============================] - 25s 302ms/step - loss: 0.0424 - accuracy: 0.9944 - precision_2: 0.9945 - recall_2: 0.9870 - val_loss: 0.3195 - val_accuracy: 0.9095 - val_precision_2: 0.9095 - val_recall_2: 0.9095 Epoch 39/50 81/81 [==============================] - 24s 301ms/step - loss: 0.0486 - accuracy: 0.9855 - precision_2: 0.9855 - recall_2: 0.9835 - val_loss: 0.3847 - val_accuracy: 0.8930 - val_precision_2: 0.8930 - val_recall_2: 0.8930 Epoch 40/50 81/81 [==============================] - 25s 306ms/step - loss: 0.0380 - accuracy: 0.9956 - precision_2: 0.9956 - recall_2: 0.9940 - val_loss: 0.3323 - val_accuracy: 0.8930 - val_precision_2: 0.8930 - val_recall_2: 0.8930 Epoch 41/50 81/81 [==============================] - 25s 306ms/step - loss: 0.0672 - accuracy: 0.9772 - precision_2: 0.9784 - recall_2: 0.9744 - val_loss: 0.3102 - val_accuracy: 0.9259 - val_precision_2: 0.9295 - val_recall_2: 0.9218 Epoch 42/50 81/81 [==============================] - 25s 305ms/step - loss: 0.0353 - accuracy: 0.9905 - precision_2: 0.9905 - recall_2: 0.9905 - val_loss: 0.3045 - val_accuracy: 0.9342 - val_precision_2: 0.9339 - val_recall_2: 0.9300 Epoch 43/50 81/81 [==============================] - 25s 304ms/step - loss: 0.0599 - accuracy: 0.9757 - precision_2: 0.9757 - recall_2: 0.9757 - val_loss: 0.3092 - val_accuracy: 0.9095 - val_precision_2: 0.9132 - val_recall_2: 0.9095 Epoch 00043: early stopping
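Training stops after epoch 43 because the callback_earlystp callback (defined earlier in the notebook) detected that the validation loss had stopped improving. Its exact settings are not repeated here; an illustrative early-stopping configuration, with assumed monitor and patience values, looks like:
from tensorflow.keras.callbacks import EarlyStopping
# Illustrative only -- the monitor/patience values below are assumptions,
# not necessarily the settings used for callback_earlystp above.
callback_earlystp = EarlyStopping(monitor='val_loss', patience=15, verbose=1, restore_best_weights=True)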
plt.subplots_adjust(right=1.95, left=.03)
plt.subplot(1,3,1)
plt.plot(fit_history_5.history['accuracy'])
plt.plot(fit_history_5.history['val_accuracy'])
plt.ylabel('Accuracy')
plt.xlabel('')
plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,2)
plt.plot(fit_history_5.history['precision_2'])
plt.plot(fit_history_5.history['val_precision_2'])
plt.ylabel('Precision')
plt.xlabel('Epoch')
#plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,3)
plt.plot(fit_history_5.history['recall_2'])
plt.plot(fit_history_5.history['val_recall_2'])
plt.ylabel('Recall')
plt.xlabel('')
#plt.legend(['training','validation'], loc="lower right")
plt.show()
test_loss, test_acc, test_precision, test_recall = model_5.evaluate(testing_data)
print('%s %.2f' % ('test_acc: ', test_acc*100.0 ))
print('%s %.2f' % ('test_loss:', test_loss ))
print('%s %.2f' % ('test_precision:', test_precision ))
print('%s %.2f' % ('test_recall:', test_recall ))
27/27 [==============================] - 6s 211ms/step - loss: 0.4097 - accuracy: 0.8971 - precision_2: 0.8967 - recall_2: 0.8930 test_acc: 89.71 test_loss: 0.41 test_precision: 0.90 test_recall: 0.89
Now that we have found a model that performs well, we will augment our training data and evaluate how performance is affected. We first define the model architecture for these experiments; note that its input shape is taken from the augmented training generator created below, so that generator must exist before this cell is run.
model_1 = Sequential()
model_1.add( Conv2D(filters=16, kernel_size=3, activation = 'relu', input_shape = augmented_training.image_shape ) )
model_1.add( MaxPool2D(5,5))
model_1.add( Conv2D(filters=8, kernel_size=3, activation = 'relu' ) )
model_1.add( MaxPool2D(5,5))
model_1.add( Conv2D(filters=4, kernel_size=3, activation = 'relu' ) )
model_1.add( MaxPool2D(5,5))
model_1.add( Flatten())
model_1.add( Dense(units=5, activation = 'relu' ) )
model_1.add( Dense(units=3, activation = 'softmax' ) )
model_1.summary()
Model: "sequential_2" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= conv2d_3 (Conv2D) (None, 254, 254, 16) 160 _________________________________________________________________ max_pooling2d_3 (MaxPooling2 (None, 50, 50, 16) 0 _________________________________________________________________ conv2d_4 (Conv2D) (None, 48, 48, 8) 1160 _________________________________________________________________ max_pooling2d_4 (MaxPooling2 (None, 9, 9, 8) 0 _________________________________________________________________ conv2d_5 (Conv2D) (None, 7, 7, 4) 292 _________________________________________________________________ max_pooling2d_5 (MaxPooling2 (None, 1, 1, 4) 0 _________________________________________________________________ flatten_1 (Flatten) (None, 4) 0 _________________________________________________________________ dense_2 (Dense) (None, 5) 25 _________________________________________________________________ dense_3 (Dense) (None, 3) 18 ================================================================= Total params: 1,655 Trainable params: 1,655 Non-trainable params: 0 _________________________________________________________________
Augmentation 1
We will first shift our images both horizontally and vertically by up to 20% of the image size, and randomly flip some of them horizontally.
augmented_generator = ImageDataGenerator(rescale=1./255, horizontal_flip=True, width_shift_range=0.2, height_shift_range=0.2)
augmented_training = augmented_generator.flow_from_directory( 'bears/training', target_size=(256, 256), batch_size=9, class_mode='categorical')
augmented_validation = augmented_generator.flow_from_directory( 'bears/validation', target_size=(256, 256), batch_size=9, class_mode='categorical')
augmented_testing = augmented_generator.flow_from_directory( 'bears/test', target_size=(256, 256), batch_size=9, class_mode='categorical')
Found 727 images belonging to 3 classes. Found 243 images belonging to 3 classes. Found 243 images belonging to 3 classes.
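Worth noting: the generators above apply the same random shifts and flips to the validation and test images, so those metrics are computed on randomly transformed copies. A common alternative (not what is done here) is to augment only the training data and keep the held-out sets rescale-only, roughly as sketched below (plain_generator is a name introduced for this sketch):
from tensorflow.keras.preprocessing.image import ImageDataGenerator
# Rescale-only generator for the held-out sets; the random shifts/flips
# stay on the training generator only.
plain_generator = ImageDataGenerator(rescale=1./255)
augmented_validation = plain_generator.flow_from_directory( 'bears/validation', target_size=(256, 256), batch_size=9, class_mode='categorical')
augmented_testing = plain_generator.flow_from_directory( 'bears/test', target_size=(256, 256), batch_size=9, class_mode='categorical')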
L = 3
plt.figure(figsize=(20,16))
# Show a 3x3 grid of images from one augmented batch.
for my_batch in augmented_training:
    images = my_batch[0]
    labels = my_batch[1]
    for r in range(0,3):
        for c in range(0,3):
            plt.subplot(L,L,r * L + c + 1)
            plt.axis('off')
            #plt.title(labels[r * L + c])
            plt.imshow(images[r * L + c])
    break
model_1.compile( optimizer = 'adam', loss = 'categorical_crossentropy', metrics = [ 'accuracy', Precision(), Recall()] )
augmented_model_history = model_1.fit( augmented_training, validation_data = augmented_validation, epochs = 50, batch_size = 64, callbacks = [callback_earlystp] )
Epoch 1/50 81/81 [==============================] - 37s 447ms/step - loss: 1.0814 - accuracy: 0.3932 - precision_3: 0.3840 - recall_3: 0.0041 - val_loss: 1.0444 - val_accuracy: 0.4938 - val_precision_3: 0.7556 - val_recall_3: 0.1399 Epoch 2/50 81/81 [==============================] - 35s 436ms/step - loss: 1.0279 - accuracy: 0.5429 - precision_3: 0.7096 - recall_3: 0.1656 - val_loss: 0.9953 - val_accuracy: 0.4938 - val_precision_3: 0.6404 - val_recall_3: 0.2346 Epoch 3/50 81/81 [==============================] - 35s 430ms/step - loss: 0.9914 - accuracy: 0.5901 - precision_3: 0.6325 - recall_3: 0.1937 - val_loss: 0.9491 - val_accuracy: 0.5473 - val_precision_3: 0.6389 - val_recall_3: 0.2840 Epoch 4/50 81/81 [==============================] - 35s 440ms/step - loss: 0.9161 - accuracy: 0.6164 - precision_3: 0.7171 - recall_3: 0.2801 - val_loss: 0.9912 - val_accuracy: 0.5144 - val_precision_3: 0.5270 - val_recall_3: 0.3210 Epoch 5/50 81/81 [==============================] - 35s 435ms/step - loss: 0.8410 - accuracy: 0.6578 - precision_3: 0.7514 - recall_3: 0.3120 - val_loss: 0.8498 - val_accuracy: 0.6872 - val_precision_3: 0.7927 - val_recall_3: 0.2675 Epoch 6/50 81/81 [==============================] - 35s 439ms/step - loss: 0.8564 - accuracy: 0.6451 - precision_3: 0.7928 - recall_3: 0.2699 - val_loss: 0.8521 - val_accuracy: 0.6543 - val_precision_3: 0.7952 - val_recall_3: 0.2716 Epoch 7/50 81/81 [==============================] - 35s 437ms/step - loss: 0.8062 - accuracy: 0.6765 - precision_3: 0.7900 - recall_3: 0.3014 - val_loss: 0.9188 - val_accuracy: 0.5432 - val_precision_3: 0.8696 - val_recall_3: 0.1646 Epoch 8/50 81/81 [==============================] - 35s 434ms/step - loss: 0.8145 - accuracy: 0.6686 - precision_3: 0.8293 - recall_3: 0.2705 - val_loss: 0.9139 - val_accuracy: 0.5103 - val_precision_3: 0.9250 - val_recall_3: 0.1523 Epoch 9/50 81/81 [==============================] - 35s 432ms/step - loss: 0.8356 - accuracy: 0.6216 - precision_3: 0.7917 - recall_3: 0.2506 - val_loss: 0.8102 - val_accuracy: 0.6831 - val_precision_3: 0.8642 - val_recall_3: 0.2881 Epoch 10/50 81/81 [==============================] - 35s 434ms/step - loss: 0.8229 - accuracy: 0.6508 - precision_3: 0.7556 - recall_3: 0.2585 - val_loss: 0.7919 - val_accuracy: 0.6420 - val_precision_3: 0.8049 - val_recall_3: 0.2716 Epoch 11/50 81/81 [==============================] - 35s 436ms/step - loss: 0.7657 - accuracy: 0.6899 - precision_3: 0.7961 - recall_3: 0.2742 - val_loss: 0.8010 - val_accuracy: 0.6831 - val_precision_3: 0.7290 - val_recall_3: 0.3210 Epoch 12/50 81/81 [==============================] - 35s 433ms/step - loss: 0.7874 - accuracy: 0.6783 - precision_3: 0.7942 - recall_3: 0.2811 - val_loss: 0.7743 - val_accuracy: 0.6584 - val_precision_3: 0.8481 - val_recall_3: 0.2757 Epoch 13/50 81/81 [==============================] - 35s 437ms/step - loss: 0.7366 - accuracy: 0.6846 - precision_3: 0.7942 - recall_3: 0.2712 - val_loss: 0.7691 - val_accuracy: 0.7119 - val_precision_3: 0.7935 - val_recall_3: 0.3004 Epoch 14/50 81/81 [==============================] - 35s 432ms/step - loss: 0.7275 - accuracy: 0.6879 - precision_3: 0.8306 - recall_3: 0.2854 - val_loss: 0.7842 - val_accuracy: 0.6543 - val_precision_3: 0.9180 - val_recall_3: 0.2305 Epoch 15/50 81/81 [==============================] - 35s 436ms/step - loss: 0.7344 - accuracy: 0.6954 - precision_3: 0.8572 - recall_3: 0.2946 - val_loss: 0.7450 - val_accuracy: 0.7572 - val_precision_3: 0.7857 - val_recall_3: 0.4979 Epoch 16/50 81/81 [==============================] - 
35s 435ms/step - loss: 0.6954 - accuracy: 0.7532 - precision_3: 0.8709 - recall_3: 0.5361 - val_loss: 0.6857 - val_accuracy: 0.8025 - val_precision_3: 0.8849 - val_recall_3: 0.5062 Epoch 17/50 81/81 [==============================] - 35s 432ms/step - loss: 0.7013 - accuracy: 0.7727 - precision_3: 0.8701 - recall_3: 0.5685 - val_loss: 0.6663 - val_accuracy: 0.8148 - val_precision_3: 0.8773 - val_recall_3: 0.5885 Epoch 18/50 81/81 [==============================] - 35s 436ms/step - loss: 0.6386 - accuracy: 0.8168 - precision_3: 0.8922 - recall_3: 0.6375 - val_loss: 0.6301 - val_accuracy: 0.8189 - val_precision_3: 0.8844 - val_recall_3: 0.7243 Epoch 19/50 81/81 [==============================] - 35s 432ms/step - loss: 0.6139 - accuracy: 0.8341 - precision_3: 0.8575 - recall_3: 0.7496 - val_loss: 0.5720 - val_accuracy: 0.8436 - val_precision_3: 0.8773 - val_recall_3: 0.7942 Epoch 20/50 81/81 [==============================] - 35s 436ms/step - loss: 0.5652 - accuracy: 0.8642 - precision_3: 0.9042 - recall_3: 0.7871 - val_loss: 0.5297 - val_accuracy: 0.8395 - val_precision_3: 0.8678 - val_recall_3: 0.8107 Epoch 21/50 81/81 [==============================] - 35s 435ms/step - loss: 0.5587 - accuracy: 0.8443 - precision_3: 0.8668 - recall_3: 0.8072 - val_loss: 0.5436 - val_accuracy: 0.8272 - val_precision_3: 0.8502 - val_recall_3: 0.7942 Epoch 22/50 81/81 [==============================] - 35s 436ms/step - loss: 0.5062 - accuracy: 0.8179 - precision_3: 0.8504 - recall_3: 0.7807 - val_loss: 0.5773 - val_accuracy: 0.8025 - val_precision_3: 0.8106 - val_recall_3: 0.7572 Epoch 23/50 81/81 [==============================] - 35s 432ms/step - loss: 0.5197 - accuracy: 0.8202 - precision_3: 0.8369 - recall_3: 0.7829 - val_loss: 0.4340 - val_accuracy: 0.8601 - val_precision_3: 0.8865 - val_recall_3: 0.8354 Epoch 24/50 81/81 [==============================] - 35s 432ms/step - loss: 0.5131 - accuracy: 0.8303 - precision_3: 0.8546 - recall_3: 0.7990 - val_loss: 0.5349 - val_accuracy: 0.7984 - val_precision_3: 0.8052 - val_recall_3: 0.7654 Epoch 25/50 81/81 [==============================] - 35s 436ms/step - loss: 0.4681 - accuracy: 0.8278 - precision_3: 0.8619 - recall_3: 0.8044 - val_loss: 0.5589 - val_accuracy: 0.8230 - val_precision_3: 0.8478 - val_recall_3: 0.8025 Epoch 26/50 81/81 [==============================] - 35s 435ms/step - loss: 0.4593 - accuracy: 0.8428 - precision_3: 0.8733 - recall_3: 0.8322 - val_loss: 0.4087 - val_accuracy: 0.8724 - val_precision_3: 0.8782 - val_recall_3: 0.8601 Epoch 27/50 81/81 [==============================] - 35s 431ms/step - loss: 0.3817 - accuracy: 0.8687 - precision_3: 0.8840 - recall_3: 0.8511 - val_loss: 0.4235 - val_accuracy: 0.8724 - val_precision_3: 0.8894 - val_recall_3: 0.8601 Epoch 28/50 81/81 [==============================] - 35s 425ms/step - loss: 0.3885 - accuracy: 0.8707 - precision_3: 0.8847 - recall_3: 0.8471 - val_loss: 0.4085 - val_accuracy: 0.8930 - val_precision_3: 0.8945 - val_recall_3: 0.8724 Epoch 29/50 81/81 [==============================] - 35s 436ms/step - loss: 0.3401 - accuracy: 0.8908 - precision_3: 0.9049 - recall_3: 0.8665 - val_loss: 0.4570 - val_accuracy: 0.8354 - val_precision_3: 0.8609 - val_recall_3: 0.8148 Epoch 30/50 81/81 [==============================] - 35s 439ms/step - loss: 0.3923 - accuracy: 0.8568 - precision_3: 0.8709 - recall_3: 0.8386 - val_loss: 0.4321 - val_accuracy: 0.8601 - val_precision_3: 0.8798 - val_recall_3: 0.8436 Epoch 31/50 81/81 [==============================] - 35s 439ms/step - loss: 0.3878 - accuracy: 
0.8708 - precision_3: 0.8875 - recall_3: 0.8551 - val_loss: 0.3547 - val_accuracy: 0.8971 - val_precision_3: 0.9076 - val_recall_3: 0.8889 Epoch 32/50 81/81 [==============================] - 35s 435ms/step - loss: 0.3709 - accuracy: 0.8756 - precision_3: 0.8944 - recall_3: 0.8582 - val_loss: 0.4380 - val_accuracy: 0.8519 - val_precision_3: 0.8686 - val_recall_3: 0.8436 Epoch 33/50 81/81 [==============================] - 35s 438ms/step - loss: 0.3527 - accuracy: 0.8606 - precision_3: 0.8750 - recall_3: 0.8495 - val_loss: 0.4003 - val_accuracy: 0.8477 - val_precision_3: 0.8613 - val_recall_3: 0.8436 Epoch 34/50 81/81 [==============================] - 36s 439ms/step - loss: 0.4271 - accuracy: 0.8492 - precision_3: 0.8813 - recall_3: 0.8390 - val_loss: 0.3970 - val_accuracy: 0.8642 - val_precision_3: 0.8874 - val_recall_3: 0.8436 Epoch 35/50 81/81 [==============================] - 35s 436ms/step - loss: 0.3372 - accuracy: 0.8867 - precision_3: 0.8980 - recall_3: 0.8724 - val_loss: 0.3237 - val_accuracy: 0.8971 - val_precision_3: 0.8996 - val_recall_3: 0.8848 Epoch 36/50 81/81 [==============================] - 35s 437ms/step - loss: 0.3187 - accuracy: 0.9047 - precision_3: 0.9190 - recall_3: 0.8937 - val_loss: 0.3884 - val_accuracy: 0.8683 - val_precision_3: 0.8729 - val_recall_3: 0.8477 Epoch 37/50 81/81 [==============================] - 35s 435ms/step - loss: 0.3522 - accuracy: 0.8723 - precision_3: 0.8902 - recall_3: 0.8591 - val_loss: 0.2850 - val_accuracy: 0.9053 - val_precision_3: 0.9087 - val_recall_3: 0.9012 Epoch 38/50 81/81 [==============================] - 35s 437ms/step - loss: 0.3298 - accuracy: 0.8909 - precision_3: 0.9080 - recall_3: 0.8710 - val_loss: 0.2891 - val_accuracy: 0.9053 - val_precision_3: 0.9195 - val_recall_3: 0.8930 Epoch 39/50 81/81 [==============================] - 35s 436ms/step - loss: 0.3398 - accuracy: 0.8863 - precision_3: 0.8990 - recall_3: 0.8634 - val_loss: 0.3437 - val_accuracy: 0.9012 - val_precision_3: 0.9188 - val_recall_3: 0.8848 Epoch 40/50 81/81 [==============================] - 35s 434ms/step - loss: 0.2853 - accuracy: 0.8966 - precision_3: 0.9111 - recall_3: 0.8854 - val_loss: 0.5297 - val_accuracy: 0.7901 - val_precision_3: 0.8128 - val_recall_3: 0.7860 Epoch 41/50 81/81 [==============================] - 35s 440ms/step - loss: 0.3632 - accuracy: 0.8649 - precision_3: 0.8790 - recall_3: 0.8473 - val_loss: 0.4456 - val_accuracy: 0.8066 - val_precision_3: 0.8248 - val_recall_3: 0.7942 Epoch 42/50 81/81 [==============================] - 35s 438ms/step - loss: 0.3512 - accuracy: 0.8644 - precision_3: 0.8726 - recall_3: 0.8528 - val_loss: 0.3132 - val_accuracy: 0.8930 - val_precision_3: 0.9060 - val_recall_3: 0.8724 Epoch 43/50 81/81 [==============================] - 35s 440ms/step - loss: 0.3094 - accuracy: 0.9003 - precision_3: 0.9156 - recall_3: 0.8862 - val_loss: 0.3126 - val_accuracy: 0.8889 - val_precision_3: 0.8950 - val_recall_3: 0.8765 Epoch 44/50 81/81 [==============================] - 35s 435ms/step - loss: 0.3008 - accuracy: 0.9076 - precision_3: 0.9326 - recall_3: 0.8927 - val_loss: 0.3294 - val_accuracy: 0.8930 - val_precision_3: 0.9068 - val_recall_3: 0.8807 Epoch 45/50 81/81 [==============================] - 35s 439ms/step - loss: 0.3493 - accuracy: 0.8736 - precision_3: 0.8842 - recall_3: 0.8566 - val_loss: 0.2756 - val_accuracy: 0.8889 - val_precision_3: 0.9072 - val_recall_3: 0.8848 Epoch 46/50 81/81 [==============================] - 35s 438ms/step - loss: 0.3560 - accuracy: 0.8747 - precision_3: 0.8835 - recall_3: 
0.8644 - val_loss: 0.3254 - val_accuracy: 0.9012 - val_precision_3: 0.9046 - val_recall_3: 0.8971 Epoch 47/50 81/81 [==============================] - 35s 436ms/step - loss: 0.3237 - accuracy: 0.8733 - precision_3: 0.8834 - recall_3: 0.8620 - val_loss: 0.2836 - val_accuracy: 0.9012 - val_precision_3: 0.9087 - val_recall_3: 0.9012 Epoch 48/50 81/81 [==============================] - 36s 442ms/step - loss: 0.2633 - accuracy: 0.9134 - precision_3: 0.9261 - recall_3: 0.8986 - val_loss: 0.3280 - val_accuracy: 0.8930 - val_precision_3: 0.8950 - val_recall_3: 0.8765 Epoch 49/50 81/81 [==============================] - 36s 441ms/step - loss: 0.3144 - accuracy: 0.8919 - precision_3: 0.9054 - recall_3: 0.8794 - val_loss: 0.3508 - val_accuracy: 0.8724 - val_precision_3: 0.8797 - val_recall_3: 0.8724 Epoch 50/50 81/81 [==============================] - 36s 442ms/step - loss: 0.2763 - accuracy: 0.8997 - precision_3: 0.9192 - recall_3: 0.8923 - val_loss: 0.2882 - val_accuracy: 0.8971 - val_precision_3: 0.9042 - val_recall_3: 0.8930
plt.subplots_adjust(right=1.95, left=.03)
plt.subplot(1,3,1)
plt.plot(augmented_model_history.history['accuracy'])
plt.plot(augmented_model_history.history['val_accuracy'])
plt.ylabel('Accuracy')
plt.xlabel('')
plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,2)
plt.plot(augmented_model_history.history['precision_3'])
plt.plot(augmented_model_history.history['val_precision_3'])
plt.ylabel('Precision')
plt.xlabel('Epoch')
#plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,3)
plt.plot(augmented_model_history.history['recall_3'])
plt.plot(augmented_model_history.history['val_recall_3'])
plt.ylabel('Recall')
plt.xlabel('')
#plt.legend(['training','validation'], loc="lower right")
plt.show()
test_loss, test_acc, test_precision, test_recall = model_1.evaluate(augmented_testing)
print('%s %.2f' % ('test_acc: ', test_acc*100.0 ))
print('%s %.2f' % ('test_loss:', test_loss ))
print('%s %.2f' % ('test_precision:', test_precision ))
print('%s %.2f' % ('test_recall:', test_recall ))
27/27 [==============================] - 8s 316ms/step - loss: 0.3053 - accuracy: 0.9012 - precision_3: 0.9106 - recall_3: 0.8807 test_acc: 90.12 test_loss: 0.31 test_precision: 0.91 test_recall: 0.88
del model_1
Augmentation 2
We will next also add a random zoom of up to 20% to our images. As before, model_1 is re-created with the same architecture before compiling and training (note the del model_1 above).
augmented_generator = ImageDataGenerator(rescale=1./255, zoom_range=0.2, horizontal_flip=True, width_shift_range=0.2, height_shift_range=0.2)
augmented_training = augmented_generator.flow_from_directory( 'bears/training', target_size=(256, 256), batch_size=9, class_mode='categorical')
augmented_validation = augmented_generator.flow_from_directory( 'bears/validation', target_size=(256, 256), batch_size=9, class_mode='categorical')
augmented_testing = augmented_generator.flow_from_directory( 'bears/test', target_size=(256, 256), batch_size=9, class_mode='categorical')
Found 727 images belonging to 3 classes. Found 243 images belonging to 3 classes. Found 243 images belonging to 3 classes.
model_1.compile( optimizer = 'adam', loss = 'categorical_crossentropy', metrics = [ 'accuracy', Precision(), Recall()] )
augmented_model_history = model_1.fit( augmented_training, validation_data = augmented_validation, epochs = 50, batch_size = 64, callbacks = [callback_earlystp] )
Epoch 1/50 81/81 [==============================] - 37s 450ms/step - loss: 1.1129 - accuracy: 0.3492 - precision_4: 0.3119 - recall_4: 0.0755 - val_loss: 1.0863 - val_accuracy: 0.4733 - val_precision_4: 0.0000e+00 - val_recall_4: 0.0000e+00 Epoch 2/50 81/81 [==============================] - 36s 441ms/step - loss: 1.0837 - accuracy: 0.4105 - precision_4: 0.3415 - recall_4: 5.6340e-04 - val_loss: 1.0720 - val_accuracy: 0.3374 - val_precision_4: 0.0000e+00 - val_recall_4: 0.0000e+00 Epoch 3/50 81/81 [==============================] - 35s 440ms/step - loss: 1.0648 - accuracy: 0.3505 - precision_4: 0.7476 - recall_4: 0.0248 - val_loss: 1.0381 - val_accuracy: 0.3539 - val_precision_4: 0.7778 - val_recall_4: 0.1440 Epoch 4/50 81/81 [==============================] - 36s 441ms/step - loss: 1.0268 - accuracy: 0.4109 - precision_4: 0.8851 - recall_4: 0.1254 - val_loss: 0.9847 - val_accuracy: 0.5638 - val_precision_4: 0.8511 - val_recall_4: 0.1646 Epoch 5/50 81/81 [==============================] - 36s 442ms/step - loss: 0.9782 - accuracy: 0.5027 - precision_4: 0.7839 - recall_4: 0.1958 - val_loss: 0.9471 - val_accuracy: 0.5556 - val_precision_4: 0.9000 - val_recall_4: 0.2222 Epoch 6/50 81/81 [==============================] - 36s 439ms/step - loss: 0.9483 - accuracy: 0.4878 - precision_4: 0.7461 - recall_4: 0.2027 - val_loss: 0.9055 - val_accuracy: 0.5185 - val_precision_4: 0.8046 - val_recall_4: 0.2881 Epoch 7/50 81/81 [==============================] - 35s 430ms/step - loss: 0.9067 - accuracy: 0.5092 - precision_4: 0.7871 - recall_4: 0.2553 - val_loss: 0.8570 - val_accuracy: 0.6091 - val_precision_4: 0.8649 - val_recall_4: 0.2634 Epoch 8/50 81/81 [==============================] - 36s 437ms/step - loss: 0.8860 - accuracy: 0.5553 - precision_4: 0.8068 - recall_4: 0.2503 - val_loss: 0.8540 - val_accuracy: 0.6461 - val_precision_4: 0.9219 - val_recall_4: 0.2428 Epoch 9/50 81/81 [==============================] - 36s 441ms/step - loss: 0.8491 - accuracy: 0.6003 - precision_4: 0.8340 - recall_4: 0.2530 - val_loss: 0.8660 - val_accuracy: 0.5556 - val_precision_4: 0.9423 - val_recall_4: 0.2016 Epoch 10/50 81/81 [==============================] - 36s 440ms/step - loss: 0.8290 - accuracy: 0.5944 - precision_4: 0.8343 - recall_4: 0.2594 - val_loss: 0.7941 - val_accuracy: 0.6132 - val_precision_4: 0.8846 - val_recall_4: 0.2840 Epoch 11/50 81/81 [==============================] - 35s 439ms/step - loss: 0.8056 - accuracy: 0.6234 - precision_4: 0.8479 - recall_4: 0.2496 - val_loss: 0.8018 - val_accuracy: 0.6543 - val_precision_4: 0.9254 - val_recall_4: 0.2551 Epoch 12/50 81/81 [==============================] - 35s 438ms/step - loss: 0.7538 - accuracy: 0.6712 - precision_4: 0.8930 - recall_4: 0.2846 - val_loss: 0.7980 - val_accuracy: 0.6008 - val_precision_4: 0.9104 - val_recall_4: 0.2510 Epoch 13/50 81/81 [==============================] - 36s 441ms/step - loss: 0.7848 - accuracy: 0.6333 - precision_4: 0.8586 - recall_4: 0.2688 - val_loss: 0.8206 - val_accuracy: 0.5638 - val_precision_4: 0.9333 - val_recall_4: 0.2305 Epoch 14/50 81/81 [==============================] - 35s 439ms/step - loss: 0.7596 - accuracy: 0.6323 - precision_4: 0.9099 - recall_4: 0.2742 - val_loss: 0.7157 - val_accuracy: 0.6502 - val_precision_4: 0.9103 - val_recall_4: 0.2922 Epoch 15/50 81/81 [==============================] - 35s 438ms/step - loss: 0.7534 - accuracy: 0.6632 - precision_4: 0.8947 - recall_4: 0.2700 - val_loss: 0.7649 - val_accuracy: 0.6955 - val_precision_4: 0.7629 - val_recall_4: 0.3045 Epoch 16/50 81/81 
[==============================] - 35s 439ms/step - loss: 0.7352 - accuracy: 0.6773 - precision_4: 0.8344 - recall_4: 0.2814 - val_loss: 0.7107 - val_accuracy: 0.6708 - val_precision_4: 0.8706 - val_recall_4: 0.3045 Epoch 17/50 81/81 [==============================] - 36s 441ms/step - loss: 0.7297 - accuracy: 0.6673 - precision_4: 0.9065 - recall_4: 0.2786 - val_loss: 0.7700 - val_accuracy: 0.5802 - val_precision_4: 0.9649 - val_recall_4: 0.2263 Epoch 18/50 81/81 [==============================] - 36s 442ms/step - loss: 0.7415 - accuracy: 0.6387 - precision_4: 0.8812 - recall_4: 0.2545 - val_loss: 0.7584 - val_accuracy: 0.6502 - val_precision_4: 0.7708 - val_recall_4: 0.3045 Epoch 19/50 81/81 [==============================] - 35s 439ms/step - loss: 0.7034 - accuracy: 0.6858 - precision_4: 0.8597 - recall_4: 0.2894 - val_loss: 0.6980 - val_accuracy: 0.7490 - val_precision_4: 0.8690 - val_recall_4: 0.3004 Epoch 20/50 81/81 [==============================] - 35s 436ms/step - loss: 0.6961 - accuracy: 0.7306 - precision_4: 0.9133 - recall_4: 0.2724 - val_loss: 0.7083 - val_accuracy: 0.6749 - val_precision_4: 0.8140 - val_recall_4: 0.2881 Epoch 21/50 81/81 [==============================] - 36s 441ms/step - loss: 0.6849 - accuracy: 0.7237 - precision_4: 0.8184 - recall_4: 0.4854 - val_loss: 0.6256 - val_accuracy: 0.8189 - val_precision_4: 0.8800 - val_recall_4: 0.6337 Epoch 22/50 81/81 [==============================] - 35s 440ms/step - loss: 0.6446 - accuracy: 0.7700 - precision_4: 0.7986 - recall_4: 0.6701 - val_loss: 0.6546 - val_accuracy: 0.7572 - val_precision_4: 0.7864 - val_recall_4: 0.6667 Epoch 23/50 81/81 [==============================] - 35s 440ms/step - loss: 0.4835 - accuracy: 0.8242 - precision_4: 0.8614 - recall_4: 0.7665 - val_loss: 0.5401 - val_accuracy: 0.8189 - val_precision_4: 0.8411 - val_recall_4: 0.7407 Epoch 24/50 81/81 [==============================] - 36s 439ms/step - loss: 0.4561 - accuracy: 0.8319 - precision_4: 0.8691 - recall_4: 0.7905 - val_loss: 0.5151 - val_accuracy: 0.7860 - val_precision_4: 0.8267 - val_recall_4: 0.7654 Epoch 25/50 81/81 [==============================] - 35s 439ms/step - loss: 0.4948 - accuracy: 0.8422 - precision_4: 0.8538 - recall_4: 0.7966 - val_loss: 0.5157 - val_accuracy: 0.8724 - val_precision_4: 0.8957 - val_recall_4: 0.8477 Epoch 26/50 81/81 [==============================] - 36s 440ms/step - loss: 0.4143 - accuracy: 0.8502 - precision_4: 0.8670 - recall_4: 0.8236 - val_loss: 0.4809 - val_accuracy: 0.8066 - val_precision_4: 0.8261 - val_recall_4: 0.7819 Epoch 27/50 81/81 [==============================] - 36s 445ms/step - loss: 0.4113 - accuracy: 0.8585 - precision_4: 0.8762 - recall_4: 0.8306 - val_loss: 0.4605 - val_accuracy: 0.8313 - val_precision_4: 0.8640 - val_recall_4: 0.8107 Epoch 28/50 81/81 [==============================] - 36s 440ms/step - loss: 0.3631 - accuracy: 0.8590 - precision_4: 0.8795 - recall_4: 0.8446 - val_loss: 0.4625 - val_accuracy: 0.8560 - val_precision_4: 0.8745 - val_recall_4: 0.8313 Epoch 29/50 81/81 [==============================] - 35s 437ms/step - loss: 0.2926 - accuracy: 0.8943 - precision_4: 0.9093 - recall_4: 0.8903 - val_loss: 0.4098 - val_accuracy: 0.8395 - val_precision_4: 0.8584 - val_recall_4: 0.8230 Epoch 30/50 81/81 [==============================] - 36s 443ms/step - loss: 0.3705 - accuracy: 0.8458 - precision_4: 0.8640 - recall_4: 0.8225 - val_loss: 0.3617 - val_accuracy: 0.8971 - val_precision_4: 0.9038 - val_recall_4: 0.8889 Epoch 31/50 81/81 [==============================] - 37s 
454ms/step - loss: 0.3969 - accuracy: 0.8264 - precision_4: 0.8427 - recall_4: 0.8105 - val_loss: 0.3583 - val_accuracy: 0.8519 - val_precision_4: 0.8692 - val_recall_4: 0.8477 Epoch 32/50 81/81 [==============================] - 37s 451ms/step - loss: 0.2812 - accuracy: 0.9201 - precision_4: 0.9264 - recall_4: 0.8959 - val_loss: 0.3404 - val_accuracy: 0.8807 - val_precision_4: 0.8866 - val_recall_4: 0.8683 Epoch 33/50 81/81 [==============================] - 36s 442ms/step - loss: 0.2967 - accuracy: 0.8887 - precision_4: 0.9055 - recall_4: 0.8758 - val_loss: 0.3678 - val_accuracy: 0.8683 - val_precision_4: 0.8856 - val_recall_4: 0.8601 Epoch 34/50 81/81 [==============================] - 36s 441ms/step - loss: 0.3126 - accuracy: 0.8828 - precision_4: 0.8938 - recall_4: 0.8631 - val_loss: 0.3538 - val_accuracy: 0.8560 - val_precision_4: 0.8771 - val_recall_4: 0.8519 Epoch 35/50 81/81 [==============================] - 36s 442ms/step - loss: 0.3025 - accuracy: 0.8953 - precision_4: 0.8995 - recall_4: 0.8788 - val_loss: 0.3737 - val_accuracy: 0.8477 - val_precision_4: 0.8619 - val_recall_4: 0.8477 Epoch 36/50 81/81 [==============================] - 36s 445ms/step - loss: 0.2784 - accuracy: 0.9047 - precision_4: 0.9169 - recall_4: 0.8801 - val_loss: 0.3280 - val_accuracy: 0.8889 - val_precision_4: 0.9034 - val_recall_4: 0.8848 Epoch 37/50 81/81 [==============================] - 36s 443ms/step - loss: 0.2980 - accuracy: 0.9021 - precision_4: 0.9031 - recall_4: 0.8866 - val_loss: 0.4033 - val_accuracy: 0.8519 - val_precision_4: 0.8675 - val_recall_4: 0.8354 Epoch 38/50 81/81 [==============================] - 36s 445ms/step - loss: 0.2469 - accuracy: 0.9020 - precision_4: 0.9112 - recall_4: 0.8895 - val_loss: 0.3542 - val_accuracy: 0.8848 - val_precision_4: 0.8979 - val_recall_4: 0.8683 Epoch 39/50 81/81 [==============================] - 36s 444ms/step - loss: 0.2501 - accuracy: 0.8986 - precision_4: 0.9090 - recall_4: 0.8923 - val_loss: 0.3118 - val_accuracy: 0.8848 - val_precision_4: 0.8950 - val_recall_4: 0.8765 Epoch 40/50 81/81 [==============================] - 36s 449ms/step - loss: 0.2226 - accuracy: 0.9268 - precision_4: 0.9330 - recall_4: 0.9131 - val_loss: 0.3317 - val_accuracy: 0.8848 - val_precision_4: 0.8917 - val_recall_4: 0.8807 Epoch 41/50 81/81 [==============================] - 37s 460ms/step - loss: 0.2682 - accuracy: 0.9010 - precision_4: 0.9102 - recall_4: 0.8778 - val_loss: 0.3195 - val_accuracy: 0.8971 - val_precision_4: 0.9038 - val_recall_4: 0.8889 Epoch 42/50 81/81 [==============================] - 36s 446ms/step - loss: 0.2053 - accuracy: 0.9296 - precision_4: 0.9321 - recall_4: 0.9262 - val_loss: 0.2896 - val_accuracy: 0.9095 - val_precision_4: 0.9153 - val_recall_4: 0.8889 Epoch 43/50 81/81 [==============================] - 36s 447ms/step - loss: 0.2934 - accuracy: 0.8930 - precision_4: 0.9047 - recall_4: 0.8880 - val_loss: 0.3873 - val_accuracy: 0.8642 - val_precision_4: 0.8678 - val_recall_4: 0.8642 Epoch 44/50 81/81 [==============================] - 36s 449ms/step - loss: 0.2099 - accuracy: 0.9377 - precision_4: 0.9393 - recall_4: 0.9317 - val_loss: 0.4173 - val_accuracy: 0.8354 - val_precision_4: 0.8432 - val_recall_4: 0.8189 Epoch 45/50 81/81 [==============================] - 36s 444ms/step - loss: 0.2823 - accuracy: 0.8985 - precision_4: 0.9047 - recall_4: 0.8849 - val_loss: 0.3132 - val_accuracy: 0.9095 - val_precision_4: 0.9156 - val_recall_4: 0.8930 Epoch 46/50 81/81 [==============================] - 36s 446ms/step - loss: 0.2514 - accuracy: 0.8967 
- precision_4: 0.9025 - recall_4: 0.8948 - val_loss: 0.2775 - val_accuracy: 0.9012 - val_precision_4: 0.9072 - val_recall_4: 0.8848 Epoch 47/50 81/81 [==============================] - 35s 440ms/step - loss: 0.2597 - accuracy: 0.9265 - precision_4: 0.9343 - recall_4: 0.9209 - val_loss: 0.2437 - val_accuracy: 0.9218 - val_precision_4: 0.9256 - val_recall_4: 0.9218 Epoch 48/50 81/81 [==============================] - 36s 442ms/step - loss: 0.2147 - accuracy: 0.9207 - precision_4: 0.9373 - recall_4: 0.9149 - val_loss: 0.2965 - val_accuracy: 0.9136 - val_precision_4: 0.9205 - val_recall_4: 0.9053 Epoch 49/50 81/81 [==============================] - 36s 444ms/step - loss: 0.2753 - accuracy: 0.9021 - precision_4: 0.9104 - recall_4: 0.8998 - val_loss: 0.2779 - val_accuracy: 0.9177 - val_precision_4: 0.9289 - val_recall_4: 0.9136 Epoch 50/50 81/81 [==============================] - 36s 442ms/step - loss: 0.1903 - accuracy: 0.9205 - precision_4: 0.9353 - recall_4: 0.9129 - val_loss: 0.2693 - val_accuracy: 0.9012 - val_precision_4: 0.9083 - val_recall_4: 0.8971
plt.subplots_adjust(right=1.95, left=.03)
plt.subplot(1,3,1)
plt.plot(augmented_model_history.history['accuracy'])
plt.plot(augmented_model_history.history['val_accuracy'])
plt.ylabel('Accuracy')
plt.xlabel('')
plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,2)
plt.plot(augmented_model_history.history['precision_4'])
plt.plot(augmented_model_history.history['val_precision_4'])
plt.ylabel('Precision')
plt.xlabel('Epoch')
#plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,3)
plt.plot(augmented_model_history.history['recall_4'])
plt.plot(augmented_model_history.history['val_recall_4'])
plt.ylabel('Recall')
plt.xlabel('')
#plt.legend(['training','validation'], loc="lower right")
plt.show()
test_loss, test_acc, test_precision, test_recall = model_1.evaluate(augmented_testing)
print('%s %.2f' % ('test_acc: ', test_acc*100.0 ))
print('%s %.2f' % ('test_loss:', test_loss ))
print('%s %.2f' % ('test_precision:', test_precision ))
print('%s %.2f' % ('test_recall:', test_recall ))
27/27 [==============================] - 8s 308ms/step - loss: 0.2733 - accuracy: 0.8971 - precision_4: 0.9046 - recall_4: 0.8971 test_acc: 89.71 test_loss: 0.27 test_precision: 0.90 test_recall: 0.90
del model_1
Augmentation 3
We will next also add a random rotation of up to 10 degrees (in either direction) to our images. Again, model_1 is re-created with the same architecture before training.
augmented_generator = ImageDataGenerator(rescale=1./255, rotation_range=10, zoom_range=0.2, horizontal_flip=True, width_shift_range=0.2, height_shift_range=0.2)
augmented_training = augmented_generator.flow_from_directory( 'bears/training', target_size=(256, 256), batch_size=9, class_mode='categorical')
augmented_validation = augmented_generator.flow_from_directory( 'bears/validation', target_size=(256, 256), batch_size=9, class_mode='categorical')
augmented_testing = augmented_generator.flow_from_directory( 'bears/test', target_size=(256, 256), batch_size=9, class_mode='categorical')
Found 727 images belonging to 3 classes. Found 243 images belonging to 3 classes. Found 243 images belonging to 3 classes.
L = 3
plt.figure(figsize=(20,16))
# Show a 3x3 grid of images from one augmented batch.
for my_batch in augmented_training:
    images = my_batch[0]
    labels = my_batch[1]
    for r in range(0,3):
        for c in range(0,3):
            plt.subplot(L,L,r * L + c + 1)
            plt.axis('off')
            #plt.title(labels[r * L + c])
            plt.imshow(images[r * L + c])
    break
model_1.compile( optimizer = 'adam', loss = 'categorical_crossentropy', metrics = [ 'accuracy', Precision(), Recall()] )
augmented_model_history = model_1.fit( augmented_training, validation_data = augmented_validation, epochs = 50, batch_size = 64, callbacks = [callback_earlystp] )
Epoch 1/50 81/81 [==============================] - 70s 470ms/step - loss: 1.0595 - accuracy: 0.4199 - precision: 0.2644 - recall: 0.0022 - val_loss: 0.9714 - val_accuracy: 0.5967 - val_precision: 1.0000 - val_recall: 0.0123 Epoch 2/50 81/81 [==============================] - 36s 452ms/step - loss: 0.9392 - accuracy: 0.5791 - precision: 0.7247 - recall: 0.1209 - val_loss: 0.8595 - val_accuracy: 0.5885 - val_precision: 0.7214 - val_recall: 0.4156 Epoch 3/50 81/81 [==============================] - 35s 441ms/step - loss: 0.7398 - accuracy: 0.6825 - precision: 0.7858 - recall: 0.5292 - val_loss: 0.7856 - val_accuracy: 0.6173 - val_precision: 0.6957 - val_recall: 0.5267 Epoch 4/50 81/81 [==============================] - 36s 447ms/step - loss: 0.7397 - accuracy: 0.6690 - precision: 0.7536 - recall: 0.5772 - val_loss: 0.7579 - val_accuracy: 0.6790 - val_precision: 0.7660 - val_recall: 0.5926 Epoch 5/50 81/81 [==============================] - 36s 447ms/step - loss: 0.6580 - accuracy: 0.7313 - precision: 0.8001 - recall: 0.6497 - val_loss: 0.6684 - val_accuracy: 0.7284 - val_precision: 0.8150 - val_recall: 0.6708 Epoch 6/50 81/81 [==============================] - 36s 441ms/step - loss: 0.6490 - accuracy: 0.7514 - precision: 0.8115 - recall: 0.6851 - val_loss: 0.6461 - val_accuracy: 0.7284 - val_precision: 0.7860 - val_recall: 0.6955 Epoch 7/50 81/81 [==============================] - 35s 436ms/step - loss: 0.6107 - accuracy: 0.7409 - precision: 0.7911 - recall: 0.6807 - val_loss: 0.6747 - val_accuracy: 0.7819 - val_precision: 0.8139 - val_recall: 0.7737 Epoch 8/50 81/81 [==============================] - 35s 439ms/step - loss: 0.6190 - accuracy: 0.7486 - precision: 0.7659 - recall: 0.7115 - val_loss: 0.6450 - val_accuracy: 0.7942 - val_precision: 0.8433 - val_recall: 0.7531 Epoch 9/50 81/81 [==============================] - 35s 437ms/step - loss: 0.5946 - accuracy: 0.7620 - precision: 0.8147 - recall: 0.7397 - val_loss: 0.6492 - val_accuracy: 0.7737 - val_precision: 0.8102 - val_recall: 0.7202 Epoch 10/50 81/81 [==============================] - 35s 435ms/step - loss: 0.6089 - accuracy: 0.7454 - precision: 0.7981 - recall: 0.7107 - val_loss: 0.5051 - val_accuracy: 0.8230 - val_precision: 0.8455 - val_recall: 0.8107 Epoch 11/50 81/81 [==============================] - 35s 439ms/step - loss: 0.5189 - accuracy: 0.7967 - precision: 0.8321 - recall: 0.7661 - val_loss: 0.5403 - val_accuracy: 0.8354 - val_precision: 0.8534 - val_recall: 0.8148 Epoch 12/50 81/81 [==============================] - 35s 434ms/step - loss: 0.4296 - accuracy: 0.8428 - precision: 0.8643 - recall: 0.8189 - val_loss: 0.4980 - val_accuracy: 0.8148 - val_precision: 0.8571 - val_recall: 0.7901 Epoch 13/50 81/81 [==============================] - 36s 442ms/step - loss: 0.4536 - accuracy: 0.8336 - precision: 0.8492 - recall: 0.7920 - val_loss: 0.5014 - val_accuracy: 0.8066 - val_precision: 0.8340 - val_recall: 0.8066 Epoch 14/50 81/81 [==============================] - 36s 441ms/step - loss: 0.4687 - accuracy: 0.8021 - precision: 0.8380 - recall: 0.7742 - val_loss: 0.4536 - val_accuracy: 0.8395 - val_precision: 0.8673 - val_recall: 0.8066 Epoch 15/50 81/81 [==============================] - 36s 440ms/step - loss: 0.4147 - accuracy: 0.8611 - precision: 0.8825 - recall: 0.8334 - val_loss: 0.4746 - val_accuracy: 0.8519 - val_precision: 0.8596 - val_recall: 0.8066 Epoch 16/50 81/81 [==============================] - 36s 446ms/step - loss: 0.3935 - accuracy: 0.8465 - precision: 0.8792 - recall: 0.8293 - val_loss: 0.5061 - val_accuracy: 
0.8395 - val_precision: 0.8615 - val_recall: 0.8189 [epochs 17-49 omitted for brevity: training accuracy climbs to ~0.91 while validation accuracy fluctuates between ~0.82 and ~0.91] Epoch 50/50 81/81 [==============================] - 35s 435ms/step - loss: 0.2463 - accuracy: 0.9149 - precision: 0.9341 - recall: 0.8927 - val_loss: 0.3567 - val_accuracy: 0.8683 - val_precision: 0.8782 - val_recall: 0.8601
plt.subplots_adjust(right=1.95, left=.03)
plt.subplot(1,3,1)
plt.plot(augmented_model_history.history['accuracy'])
plt.plot(augmented_model_history.history['val_accuracy'])
plt.ylabel('Accuracy')
plt.xlabel('')
plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,2)
plt.plot(augmented_model_history.history['precision'])
plt.plot(augmented_model_history.history['val_precision'])
plt.ylabel('Precision')
plt.xlabel('Epoch')
#plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,3)
plt.plot(augmented_model_history.history['recall'])
plt.plot(augmented_model_history.history['val_recall'])
plt.ylabel('Recall')
plt.xlabel('')
#plt.legend(['training','validation'], loc="lower right")
plt.show()
test_loss, test_acc, test_precision, test_recall = model_1.evaluate(augmented_testing)
print('%s %.2f' % ('test_acc: ', test_acc*100.0 ))
print('%s %.2f' % ('test_loss:', test_loss ))
print('%s %.2f' % ('test_precision:', test_precision ))
print('%s %.2f' % ('test_recall:', test_recall ))
27/27 [==============================] - 8s 309ms/step - loss: 0.3396 - accuracy: 0.8765 - precision: 0.8889 - recall: 0.8560 test_acc: 87.65 test_loss: 0.34 test_precision: 0.89 test_recall: 0.86
del model_1
Augmentation 4
Next, we will see how the model performs when we reduce the input from three color channels to one (grayscale). This tests whether color is a critical feature for telling the species apart.
augmented_training = augmented_generator.flow_from_directory( 'bears/training', target_size=(256, 256), batch_size=9, class_mode='categorical', color_mode='grayscale')
augmented_validation = augmented_generator.flow_from_directory( 'bears/validation', target_size=(256, 256), batch_size=9, class_mode='categorical', color_mode='grayscale')
augmented_testing = augmented_generator.flow_from_directory( 'bears/test', target_size=(256, 256), batch_size=9, class_mode='categorical', color_mode='grayscale')
Found 727 images belonging to 3 classes. Found 243 images belonging to 3 classes. Found 243 images belonging to 3 classes.
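As a quick sanity check, in the spirit of the shape checks earlier in the notebook, the generator should now report a single channel:
# Hypothetical check cell (not in the original export):
print(augmented_training.image_shape)   # expected: (256, 256, 1)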
L = 3
plt.figure(figsize=(20,16))
for my_batch in augmented_training:
images = my_batch[0]
labels = my_batch[1]
for r in range(0,3):
for c in range(0,3):
plt.subplot(L,L,r * L + c + 1)
plt.axis('off')
#plt.title(labels[r * L + c])
plt.imshow(np.squeeze(images[r * L + c]), cmap="gray")
break
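Note: model_1 was deleted after the previous experiment, so it has to be re-defined before the compile call below; the defining cell does not appear in this export. A sketch of what it presumably looks like, assuming the same three-convolution architecture used throughout and the grayscale input shape:
# Assumed re-definition of model_1 (cell missing from the export):
model_1 = Sequential()
model_1.add( Conv2D(filters=16, kernel_size=3, activation = 'relu', input_shape = augmented_training.image_shape ) )
model_1.add( MaxPool2D(5,5))
model_1.add( Conv2D(filters=8, kernel_size=3, activation = 'relu' ) )
model_1.add( MaxPool2D(5,5))
model_1.add( Conv2D(filters=4, kernel_size=3, activation = 'relu' ) )
model_1.add( MaxPool2D(5,5))
model_1.add( Flatten())
model_1.add( Dense(units=5, activation = 'relu' ) )
model_1.add( Dense(units=3, activation = 'softmax' ) )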
model_1.compile( optimizer = 'adam', loss = 'categorical_crossentropy', metrics = [ 'accuracy', Precision(), Recall()] )
augmented_model_history = model_1.fit( augmented_training, validation_data = augmented_validation, epochs = 50, batch_size = 64, callbacks = [callback_earlystp] )  # note: batch_size is ignored when fitting from a generator; the generator's batch size of 9 applies (hence 81 steps per epoch)
Epoch 1/50 81/81 [==============================] - 31s 380ms/step - loss: 1.0841 - accuracy: 0.3439 - precision_1: 0.0000e+00 - recall_1: 0.0000e+00 - val_loss: 1.0270 - val_accuracy: 0.4362 - val_precision_1: 1.0000 - val_recall_1: 0.0041 [epochs 2-49 omitted for brevity: training accuracy rises steadily to ~0.89; validation accuracy peaks at 0.9177 in epoch 49] Epoch 50/50 81/81 [==============================] - 30s 370ms/step - loss: 0.3632 - accuracy: 0.8481 - precision_1: 0.8785 - recall_1: 0.8292 - val_loss: 0.3207 - val_accuracy: 0.8930 - val_precision_1: 0.9060 - val_recall_1: 0.8724
plt.subplots_adjust(right=1.95, left=.03)
plt.subplot(1,3,1)
plt.plot(augmented_model_history.history['accuracy'])
plt.plot(augmented_model_history.history['val_accuracy'])
plt.ylabel('Accuracy')
plt.xlabel('')
plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,2)
plt.plot(augmented_model_history.history['precision_1'])
plt.plot(augmented_model_history.history['val_precision_1'])
plt.ylabel('Precision')
plt.xlabel('Epoch')
#plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,3)
plt.plot(augmented_model_history.history['recall_1'])
plt.plot(augmented_model_history.history['val_recall_1'])
plt.ylabel('Recall')
plt.xlabel('')
#plt.legend(['training','validation'], loc="lower right")
plt.show()
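Aside: each time Precision() and Recall() are re-instantiated, Keras auto-numbers the recorded metric keys (precision, precision_1, precision_2, ...), which is why every plotting cell has to hard-code a different suffix. A small helper, hypothetical and not part of the original notebook, that looks the keys up instead of hard-coding them:
# Plot accuracy/precision/recall curves for any History object,
# whatever numeric suffix Keras attached to the metric names.
def plot_history(history):
    prec_key = next(k for k in history.history if k.startswith('precision'))
    rec_key = next(k for k in history.history if k.startswith('recall'))
    plt.subplots_adjust(right=1.95, left=.03)
    for i, (key, label) in enumerate([('accuracy', 'Accuracy'), (prec_key, 'Precision'), (rec_key, 'Recall')]):
        plt.subplot(1, 3, i + 1)
        plt.plot(history.history[key])            # training curve
        plt.plot(history.history['val_' + key])   # validation curve
        plt.ylabel(label)
        plt.xlabel('Epoch')
    plt.legend(['training', 'validation'], loc="lower right")
    plt.show()
# e.g. plot_history(augmented_model_history)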
test_loss, test_acc, test_precision, test_recall = model_1.evaluate(augmented_testing)
print('%s %.2f' % ('test_acc: ', test_acc*100.0 ))
print('%s %.2f' % ('test_loss:', test_loss ))
print('%s %.2f' % ('test_precision:', test_precision ))
print('%s %.2f' % ('test_recall:', test_recall ))
27/27 [==============================] - 7s 269ms/step - loss: 0.4311 - accuracy: 0.8477 - precision_1: 0.8690 - recall_1: 0.8189 test_acc: 84.77 test_loss: 0.43 test_precision: 0.87 test_recall: 0.82
del model_1
We will next apply a few regularization methods (batch normalization, dropout, and L2 weight penalties) to see if we can improve accuracy.
Regularization #1
First, we will add a BatchNormalization layer before each convolution in our best-performing model.
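As a toy illustration of what BatchNormalization computes at training time, independent of the model below: each channel is standardized using the batch mean and variance (plus a small epsilon), then rescaled by learned gamma/beta parameters (here left at their initial values of 1 and 0):
# Hypothetical illustration, not a cell from the original notebook:
import numpy as np
x = np.array([1.0, 2.0, 3.0, 4.0])
x_hat = (x - x.mean()) / np.sqrt(x.var() + 1e-3)   # 1e-3 is Keras's default epsilon
print(x_hat.round(3))   # [-1.341 -0.447  0.447  1.341]: zero mean, unit variance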
from tensorflow.keras.layers import BatchNormalization, Dropout
model_1 = Sequential()
model_1.add( BatchNormalization())
model_1.add( Conv2D(filters=16, kernel_size=3, activation = 'relu', input_shape = training_data.image_shape ) )
model_1.add( MaxPool2D(5,5))
model_1.add( BatchNormalization())
model_1.add( Conv2D(filters=8, kernel_size=3, activation = 'relu' ) )
model_1.add( MaxPool2D(5,5))
model_1.add( BatchNormalization())
model_1.add( Conv2D(filters=4, kernel_size=3, activation = 'relu' ) )
model_1.add( MaxPool2D(5,5))
model_1.add( Flatten())
model_1.add( Dense(units=5, activation = 'relu' ) )
model_1.add( Dense(units=3, activation = 'softmax' ) )
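Because BatchNormalization is now the first layer and was added without an input shape, the input_shape argument on the Conv2D below it is ignored, so Keras builds the model with unspecified spatial dimensions (the summary further down reports (None, None, None, ...) shapes). A hypothetical variant that pins the shapes down would declare the input shape on the first layer instead:
# Variant, not the notebook's original cell:
# model_1 = Sequential()
# model_1.add( BatchNormalization(input_shape = training_data.image_shape) )
# model_1.add( Conv2D(filters=16, kernel_size=3, activation = 'relu') )
# ... remaining layers unchanged ...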
model_1.compile( optimizer = 'adam', loss = 'categorical_crossentropy', metrics = [ 'accuracy', Precision(), Recall()] )
fit_history_1 = model_1.fit( training_data, validation_data = validation_data, epochs = 50, batch_size = 64, callbacks = [callback_earlystp] )
Epoch 1/50 81/81 [==============================] - 27s 319ms/step - loss: 1.1176 - accuracy: 0.3880 - precision_2: 0.3590 - recall_2: 0.0293 - val_loss: 1.1204 - val_accuracy: 0.3086 - val_precision_2: 0.0000e+00 - val_recall_2: 0.0000e+00 [epochs 2-49 omitted for brevity: training accuracy climbs to ~0.98; validation accuracy peaks at 0.9630 in epoch 47] Epoch 50/50 81/81 [==============================] - 25s 308ms/step - loss: 0.0604 - accuracy: 0.9811 - precision_2: 0.9871 - recall_2: 0.9811 - val_loss: 0.2022 - val_accuracy: 0.9588 - val_precision_2: 0.9667 - val_recall_2: 0.9547
model_1.summary()
Model: "sequential_6" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= batch_normalization_6 (Batch (None, None, None, 3) 12 _________________________________________________________________ conv2d_12 (Conv2D) (None, None, None, 16) 448 _________________________________________________________________ max_pooling2d_12 (MaxPooling (None, None, None, 16) 0 _________________________________________________________________ batch_normalization_7 (Batch (None, None, None, 16) 64 _________________________________________________________________ conv2d_13 (Conv2D) (None, None, None, 8) 1160 _________________________________________________________________ max_pooling2d_13 (MaxPooling (None, None, None, 8) 0 _________________________________________________________________ batch_normalization_8 (Batch (None, None, None, 8) 32 _________________________________________________________________ conv2d_14 (Conv2D) (None, None, None, 4) 292 _________________________________________________________________ max_pooling2d_14 (MaxPooling (None, None, None, 4) 0 _________________________________________________________________ flatten_4 (Flatten) (None, None) 0 _________________________________________________________________ dense_8 (Dense) (None, 5) 25 _________________________________________________________________ dense_9 (Dense) (None, 3) 18 ================================================================= Total params: 2,051 Trainable params: 1,997 Non-trainable params: 54 _________________________________________________________________
plt.subplots_adjust(right=1.95, left=.03)
plt.subplot(1,3,1)
plt.plot(fit_history_1.history['accuracy'])
plt.plot(fit_history_1.history['val_accuracy'])
plt.ylabel('Accuracy')
plt.xlabel('')
plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,2)
plt.plot(fit_history_1.history['precision_2'])
plt.plot(fit_history_1.history['val_precision_2'])
plt.ylabel('Precision')
plt.xlabel('Epoch')
#plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,3)
plt.plot(fit_history_1.history['recall_2'])
plt.plot(fit_history_1.history['val_recall_2'])
plt.ylabel('Recall')
plt.xlabel('')
#plt.legend(['training','validation'], loc="lower right")
plt.show()
test_loss, test_acc, test_precision, test_recall = model_1.evaluate(testing_data)
print('%s %.2f' % ('test_acc: ', test_acc*100.0 ))
print('%s %.2f' % ('test_loss:', test_loss ))
print('%s %.2f' % ('test_precision:', test_precision ))
print('%s %.2f' % ('test_recall:', test_recall ))
27/27 [==============================] - 6s 215ms/step - loss: 0.2080 - accuracy: 0.9259 - precision_2: 0.9259 - recall_2: 0.9259 test_acc: 92.59 test_loss: 0.21 test_precision: 0.93 test_recall: 0.93
del model_1
Regularization #2
We will next add Dropout layers (rate 0.2) after each pooling block.
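As a toy illustration of what Dropout(rate=0.2) does during training, before we wire it into the model: each activation is zeroed with probability 0.2 and the survivors are scaled by 1/(1 - 0.2), so the expected activation is unchanged; at inference the layer is an identity.
# Hypothetical illustration, not a cell from the original notebook:
import numpy as np
rng = np.random.default_rng(0)
x = np.ones(10)
keep = rng.random(10) >= 0.2            # each unit survives with p = 0.8
x_train = np.where(keep, x / 0.8, 0.0)  # inverted-dropout scaling
print(x_train)                          # expected value of each entry is still 1.0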
model_1 = Sequential()
model_1.add( Conv2D(filters=16, kernel_size=3, activation = 'relu', input_shape = training_data.image_shape ) )
model_1.add( MaxPool2D(5,5))
model_1.add( Dropout(rate=0.2))
model_1.add( Conv2D(filters=8, kernel_size=3, activation = 'relu' ) )
model_1.add( MaxPool2D(5,5))
model_1.add( Dropout(rate=0.2))
model_1.add( Conv2D(filters=4, kernel_size=3, activation = 'relu' ) )
model_1.add( MaxPool2D(5,5))
model_1.add( Dropout(rate=0.2))
model_1.add( Flatten())
model_1.add( Dense(units=5, activation = 'relu' ) )
model_1.add( Dense(units=3, activation = 'softmax' ) )
model_1.compile( optimizer = 'adam', loss = 'categorical_crossentropy', metrics = [ 'accuracy', Precision(), Recall()] )
fit_history_1 = model_1.fit( training_data, validation_data = validation_data, epochs = 50, batch_size = 64, callbacks = [callback_earlystp] )
Epoch 1/50 81/81 [==============================] - 26s 316ms/step - loss: 1.1321 - accuracy: 0.3396 - precision_5: 0.3057 - recall_5: 0.0416 - val_loss: 1.0959 - val_accuracy: 0.4815 - val_precision_5: 0.0000e+00 - val_recall_5: 0.0000e+00 [epochs 2-49 omitted for brevity: training accuracy climbs noisily to ~0.83 while validation accuracy reaches 0.9218 by epoch 50] Epoch 50/50 81/81 [==============================] - 25s 306ms/step - loss: 0.5347 - accuracy: 0.7582 - precision_5: 0.8166 - recall_5: 0.7166 - val_loss: 0.3616 - val_accuracy: 0.9218 - val_precision_5: 0.9440 - val_recall_5: 0.9012
model_1.summary()
Model: "sequential_8" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= conv2d_18 (Conv2D) (None, 254, 254, 16) 448 _________________________________________________________________ max_pooling2d_18 (MaxPooling (None, 50, 50, 16) 0 _________________________________________________________________ dropout_3 (Dropout) (None, 50, 50, 16) 0 _________________________________________________________________ conv2d_19 (Conv2D) (None, 48, 48, 8) 1160 _________________________________________________________________ max_pooling2d_19 (MaxPooling (None, 9, 9, 8) 0 _________________________________________________________________ dropout_4 (Dropout) (None, 9, 9, 8) 0 _________________________________________________________________ conv2d_20 (Conv2D) (None, 7, 7, 4) 292 _________________________________________________________________ max_pooling2d_20 (MaxPooling (None, 1, 1, 4) 0 _________________________________________________________________ dropout_5 (Dropout) (None, 1, 1, 4) 0 _________________________________________________________________ flatten_6 (Flatten) (None, 4) 0 _________________________________________________________________ dense_12 (Dense) (None, 5) 25 _________________________________________________________________ dense_13 (Dense) (None, 3) 18 ================================================================= Total params: 1,943 Trainable params: 1,943 Non-trainable params: 0 _________________________________________________________________
plt.subplots_adjust(right=1.95, left=.03)
plt.subplot(1,3,1)
plt.plot(fit_history_1.history['accuracy'])
plt.plot(fit_history_1.history['val_accuracy'])
plt.ylabel('Accuracy')
plt.xlabel('')
plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,2)
plt.plot(fit_history_1.history['precision_5'])
plt.plot(fit_history_1.history['val_precision_5'])
plt.ylabel('Precision')
plt.xlabel('Epoch')
#plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,3)
plt.plot(fit_history_1.history['recall_5'])
plt.plot(fit_history_1.history['val_recall_5'])
plt.ylabel('Recall')
plt.xlabel('')
#plt.legend(['training','validation'], loc="lower right")
plt.show()
test_loss, test_acc, test_precision, test_recall = model_1.evaluate(testing_data)
print('%s %.2f' % ('test_acc: ', test_acc*100.0 ))
print('%s %.2f' % ('test_loss:', test_loss ))
print('%s %.2f' % ('test_precision:', test_precision ))
print('%s %.2f' % ('test_recall:', test_recall ))
27/27 [==============================] - 6s 234ms/step - loss: 0.3913 - accuracy: 0.9136 - precision_5: 0.9181 - recall_5: 0.8765 test_acc: 91.36 test_loss: 0.39 test_precision: 0.92 test_recall: 0.88
del model_1
Regularization #3
We will next add L2 weight regularization to each convolutional layer (see the note on the penalty factor after the model definition).
model_1 = Sequential()
model_1.add( Conv2D(filters=16, kernel_size=3, kernel_regularizer="l2", activation = 'relu', input_shape = training_data.image_shape ) )
model_1.add( MaxPool2D(5,5))
model_1.add( Conv2D(filters=8, kernel_size=3, kernel_regularizer="l2", activation = 'relu' ) )
model_1.add( MaxPool2D(5,5))
model_1.add( Conv2D(filters=4, kernel_size=3, kernel_regularizer="l2", activation = 'relu' ) )
model_1.add( MaxPool2D(5,5))
model_1.add( Flatten())
model_1.add( Dense(units=5, activation = 'relu' ) )
model_1.add( Dense(units=3, activation = 'softmax' ) )
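Passing the string "l2" uses Keras's default penalty factor of 0.01, which adds 0.01 × the sum of squared kernel weights of each layer to the training loss. The explicit equivalent, shown only as a sketch in case we later want to tune the strength:
# Hypothetical explicit form (not the original cell); the first layer
# above could equivalently be written as:
# from tensorflow.keras.regularizers import l2
# model_1.add( Conv2D(filters=16, kernel_size=3, kernel_regularizer=l2(0.01), activation = 'relu', input_shape = training_data.image_shape ) )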
model_1.compile( optimizer = 'adam', loss = 'categorical_crossentropy', metrics = [ 'accuracy', Precision(), Recall()] )
fit_history_1 = model_1.fit( training_data, validation_data = validation_data, epochs = 50, batch_size = 64, callbacks = [callback_earlystp] )
Epoch 1/50 81/81 [==============================] - 28s 334ms/step - loss: 1.2639 - accuracy: 0.3836 - precision_6: 0.0000e+00 - recall_6: 0.0000e+00 - val_loss: 1.1751 - val_accuracy: 0.5226 - val_precision_6: 0.0000e+00 - val_recall_6: 0.0000e+00 [epochs 2-49 omitted for brevity: training accuracy climbs to ~0.93; validation accuracy peaks at 0.9177 in epoch 40] Epoch 50/50 81/81 [==============================] - 25s 313ms/step - loss: 0.3226 - accuracy: 0.9073 - precision_6: 0.9190 - recall_6: 0.8991 - val_loss: 0.4015 - val_accuracy: 0.9053 - val_precision_6: 0.9121 - val_recall_6: 0.8971
model_1.summary()
Model: "sequential_9" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= conv2d_21 (Conv2D) (None, 254, 254, 16) 448 _________________________________________________________________ max_pooling2d_21 (MaxPooling (None, 50, 50, 16) 0 _________________________________________________________________ conv2d_22 (Conv2D) (None, 48, 48, 8) 1160 _________________________________________________________________ max_pooling2d_22 (MaxPooling (None, 9, 9, 8) 0 _________________________________________________________________ conv2d_23 (Conv2D) (None, 7, 7, 4) 292 _________________________________________________________________ max_pooling2d_23 (MaxPooling (None, 1, 1, 4) 0 _________________________________________________________________ flatten_7 (Flatten) (None, 4) 0 _________________________________________________________________ dense_14 (Dense) (None, 5) 25 _________________________________________________________________ dense_15 (Dense) (None, 3) 18 ================================================================= Total params: 1,943 Trainable params: 1,943 Non-trainable params: 0 _________________________________________________________________
plt.subplots_adjust(right=1.95, left=.03)
plt.subplot(1,3,1)
plt.plot(fit_history_1.history['accuracy'])
plt.plot(fit_history_1.history['val_accuracy'])
plt.ylabel('Accuracy')
plt.xlabel('')
plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,2)
plt.plot(fit_history_1.history['precision_6'])
plt.plot(fit_history_1.history['val_precision_6'])
plt.ylabel('Precision')
plt.xlabel('Epoch')
#plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,3)
plt.plot(fit_history_1.history['recall_6'])
plt.plot(fit_history_1.history['val_recall_6'])
plt.ylabel('Recall')
plt.xlabel('')
#plt.legend(['training','validation'], loc="lower right")
plt.show()
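This three-panel accuracy/precision/recall plot is repeated for every model in the notebook, so a small helper could cut the duplication. A minimal sketch, assuming the Keras History objects and metric key names used above (pass the suffixed keys, e.g. 'precision_6', where Keras auto-numbered the metrics):
def plot_history(history, precision_key='precision', recall_key='recall'):
    # Training vs. validation curves for accuracy, precision, and recall.
    plt.subplots_adjust(right=1.95, left=.03)
    panels = [('accuracy', 'Accuracy'), (precision_key, 'Precision'), (recall_key, 'Recall')]
    for i, (key, label) in enumerate(panels):
        plt.subplot(1, 3, i + 1)
        plt.plot(history.history[key])
        plt.plot(history.history['val_' + key])
        plt.ylabel(label)
        plt.xlabel('Epoch')
    plt.legend(['training', 'validation'], loc="lower right")
    plt.show()
# Example: plot_history(fit_history_1, 'precision_6', 'recall_6')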
test_loss, test_acc, test_precision, test_recall = model_1.evaluate(testing_data)
print('%s %.2f' % ('test_acc: ', test_acc*100.0 ))
print('%s %.2f' % ('test_loss:', test_loss ))
print('%s %.2f' % ('test_precision:', test_precision ))
print('%s %.2f' % ('test_recall:', test_recall ))
27/27 [==============================] - 6s 214ms/step - loss: 0.4841 - accuracy: 0.8642 - precision_6: 0.8841 - recall_6: 0.8477 test_acc: 86.42 test_loss: 0.48 test_precision: 0.88 test_recall: 0.85
Regularization #4
This variant combines batch normalization ahead of each convolution with dropout after each pooling stage, two complementary ways of discouraging the network from memorizing the training images.
model_1 = Sequential()
# Note: a Sequential model only uses input_shape from the first layer added;
# here the first layer is BatchNormalization without one, so the model is
# built with unspecified spatial dimensions (the None entries in the summary below).
model_1.add( BatchNormalization())
model_1.add( Conv2D(filters=16, kernel_size=3, activation = 'relu', input_shape = training_data.image_shape ) )
model_1.add( MaxPool2D(5,5))   # pool_size=5 with stride 5
model_1.add( Dropout(rate=0.2))
model_1.add( BatchNormalization())
model_1.add( Conv2D(filters=8, kernel_size=3, activation = 'relu' ) )
model_1.add( MaxPool2D(5,5))
model_1.add( Dropout(rate=0.2))
model_1.add( BatchNormalization())
model_1.add( Conv2D(filters=4, kernel_size=3, activation = 'relu' ) )
model_1.add( MaxPool2D(5,5))
model_1.add( Dropout(rate=0.2))
model_1.add( Flatten())
model_1.add( Dense(units=5, activation = 'relu' ) )
model_1.add( Dense(units=3, activation = 'softmax' ) )
model_1.compile( optimizer = 'adam', loss = 'categorical_crossentropy', metrics = [ 'accuracy', Precision(), Recall()] )
# Note: with generator input, the generators' own batch size (9) determines the 81 steps per epoch; the batch_size argument has no effect here.
fit_history_1 = model_1.fit( training_data, validation_data = validation_data, epochs = 50, batch_size = 64, callbacks = [callback_earlystp] )
Epoch 1/50 81/81 [==============================] - 27s 315ms/step - loss: 1.9348 - accuracy: 0.3709 - precision: 0.3724 - recall: 0.3477 - val_loss: 1.1051 - val_accuracy: 0.3128 - val_precision: 0.0000e+00 - val_recall: 0.0000e+00 Epoch 2/50 81/81 [==============================] - 25s 316ms/step - loss: 1.2090 - accuracy: 0.4144 - precision: 0.4787 - recall: 0.3072 - val_loss: 1.0754 - val_accuracy: 0.3128 - val_precision: 0.0000e+00 - val_recall: 0.0000e+00 Epoch 3/50 81/81 [==============================] - 25s 311ms/step - loss: 0.9750 - accuracy: 0.5121 - precision: 0.6807 - recall: 0.3118 - val_loss: 1.0381 - val_accuracy: 0.4733 - val_precision: 1.0000 - val_recall: 0.0206 Epoch 4/50 81/81 [==============================] - 25s 311ms/step - loss: 0.8946 - accuracy: 0.5907 - precision: 0.7611 - recall: 0.3093 - val_loss: 0.9620 - val_accuracy: 0.5844 - val_precision: 0.7429 - val_recall: 0.1070 Epoch 5/50 81/81 [==============================] - 25s 313ms/step - loss: 0.8658 - accuracy: 0.6483 - precision: 0.7405 - recall: 0.3811 - val_loss: 0.8704 - val_accuracy: 0.6996 - val_precision: 0.8060 - val_recall: 0.2222 Epoch 6/50 81/81 [==============================] - 25s 309ms/step - loss: 0.8065 - accuracy: 0.6816 - precision: 0.7968 - recall: 0.4248 - val_loss: 0.8157 - val_accuracy: 0.6872 - val_precision: 0.7376 - val_recall: 0.4280 Epoch 7/50 81/81 [==============================] - 25s 309ms/step - loss: 0.8011 - accuracy: 0.6719 - precision: 0.7507 - recall: 0.4991 - val_loss: 0.7720 - val_accuracy: 0.6996 - val_precision: 0.8246 - val_recall: 0.5802 Epoch 8/50 81/81 [==============================] - 25s 308ms/step - loss: 0.7502 - accuracy: 0.6960 - precision: 0.7692 - recall: 0.5637 - val_loss: 0.7392 - val_accuracy: 0.7160 - val_precision: 0.7720 - val_recall: 0.6132 Epoch 9/50 81/81 [==============================] - 25s 310ms/step - loss: 0.7555 - accuracy: 0.6855 - precision: 0.7361 - recall: 0.6048 - val_loss: 0.7272 - val_accuracy: 0.7366 - val_precision: 0.7711 - val_recall: 0.6379 Epoch 10/50 81/81 [==============================] - 25s 310ms/step - loss: 0.7207 - accuracy: 0.7045 - precision: 0.7509 - recall: 0.6436 - val_loss: 0.6697 - val_accuracy: 0.7613 - val_precision: 0.7981 - val_recall: 0.6996 Epoch 11/50 81/81 [==============================] - 25s 309ms/step - loss: 0.7491 - accuracy: 0.6789 - precision: 0.7223 - recall: 0.6201 - val_loss: 0.6254 - val_accuracy: 0.8066 - val_precision: 0.8380 - val_recall: 0.7449 Epoch 12/50 81/81 [==============================] - 25s 308ms/step - loss: 0.7012 - accuracy: 0.7250 - precision: 0.7889 - recall: 0.6691 - val_loss: 0.6439 - val_accuracy: 0.7860 - val_precision: 0.8284 - val_recall: 0.6955 Epoch 13/50 81/81 [==============================] - 25s 308ms/step - loss: 0.7412 - accuracy: 0.6925 - precision: 0.7208 - recall: 0.6317 - val_loss: 0.5684 - val_accuracy: 0.8313 - val_precision: 0.8578 - val_recall: 0.7695 Epoch 14/50 81/81 [==============================] - 25s 308ms/step - loss: 0.6939 - accuracy: 0.7179 - precision: 0.7474 - recall: 0.6719 - val_loss: 0.6473 - val_accuracy: 0.7490 - val_precision: 0.7844 - val_recall: 0.7037 Epoch 15/50 81/81 [==============================] - 25s 312ms/step - loss: 0.6808 - accuracy: 0.7412 - precision: 0.7763 - recall: 0.6904 - val_loss: 0.5489 - val_accuracy: 0.8272 - val_precision: 0.8559 - val_recall: 0.7819 Epoch 16/50 81/81 [==============================] - 25s 309ms/step - loss: 0.6940 - accuracy: 0.7223 - precision: 0.7507 - recall: 0.6739 - val_loss: 0.6583 
- val_accuracy: 0.7449 - val_precision: 0.7736 - val_recall: 0.6749 Epoch 17/50 81/81 [==============================] - 25s 310ms/step - loss: 0.6708 - accuracy: 0.7301 - precision: 0.7705 - recall: 0.6843 - val_loss: 0.5414 - val_accuracy: 0.8519 - val_precision: 0.8899 - val_recall: 0.7984 Epoch 18/50 81/81 [==============================] - 25s 307ms/step - loss: 0.6442 - accuracy: 0.7377 - precision: 0.8013 - recall: 0.6966 - val_loss: 0.5290 - val_accuracy: 0.8313 - val_precision: 0.8688 - val_recall: 0.7901 Epoch 19/50 81/81 [==============================] - 25s 313ms/step - loss: 0.6461 - accuracy: 0.7506 - precision: 0.7849 - recall: 0.7159 - val_loss: 0.5509 - val_accuracy: 0.8313 - val_precision: 0.8676 - val_recall: 0.7819 Epoch 20/50 81/81 [==============================] - 25s 313ms/step - loss: 0.6172 - accuracy: 0.7496 - precision: 0.7989 - recall: 0.7171 - val_loss: 0.5286 - val_accuracy: 0.8230 - val_precision: 0.8645 - val_recall: 0.7613 Epoch 21/50 81/81 [==============================] - 25s 312ms/step - loss: 0.6334 - accuracy: 0.7328 - precision: 0.7501 - recall: 0.7085 - val_loss: 0.4977 - val_accuracy: 0.8642 - val_precision: 0.8839 - val_recall: 0.8148 Epoch 22/50 81/81 [==============================] - 25s 315ms/step - loss: 0.5775 - accuracy: 0.7946 - precision: 0.8069 - recall: 0.7615 - val_loss: 0.5218 - val_accuracy: 0.8642 - val_precision: 0.8991 - val_recall: 0.8066 Epoch 23/50 81/81 [==============================] - 25s 307ms/step - loss: 0.6131 - accuracy: 0.7541 - precision: 0.7715 - recall: 0.7151 - val_loss: 0.5076 - val_accuracy: 0.8971 - val_precision: 0.9120 - val_recall: 0.8107 Epoch 24/50 81/81 [==============================] - 25s 308ms/step - loss: 0.5641 - accuracy: 0.7608 - precision: 0.7915 - recall: 0.7342 - val_loss: 0.4847 - val_accuracy: 0.8889 - val_precision: 0.9123 - val_recall: 0.8560 Epoch 25/50 81/81 [==============================] - 25s 309ms/step - loss: 0.5459 - accuracy: 0.7907 - precision: 0.8188 - recall: 0.7513 - val_loss: 0.4978 - val_accuracy: 0.8765 - val_precision: 0.9120 - val_recall: 0.8107 Epoch 26/50 81/81 [==============================] - 25s 309ms/step - loss: 0.5595 - accuracy: 0.7536 - precision: 0.7892 - recall: 0.7360 - val_loss: 0.5585 - val_accuracy: 0.8436 - val_precision: 0.8894 - val_recall: 0.7613 Epoch 27/50 81/81 [==============================] - 25s 307ms/step - loss: 0.6200 - accuracy: 0.7475 - precision: 0.7764 - recall: 0.7098 - val_loss: 0.4736 - val_accuracy: 0.8930 - val_precision: 0.9107 - val_recall: 0.8395 Epoch 28/50 81/81 [==============================] - 25s 308ms/step - loss: 0.5101 - accuracy: 0.8186 - precision: 0.8299 - recall: 0.7892 - val_loss: 0.5011 - val_accuracy: 0.8642 - val_precision: 0.8977 - val_recall: 0.7942 Epoch 29/50 81/81 [==============================] - 25s 307ms/step - loss: 0.5963 - accuracy: 0.7810 - precision: 0.8075 - recall: 0.7467 - val_loss: 0.4466 - val_accuracy: 0.8848 - val_precision: 0.8898 - val_recall: 0.8642 Epoch 30/50 81/81 [==============================] - 25s 310ms/step - loss: 0.5237 - accuracy: 0.7847 - precision: 0.8081 - recall: 0.7539 - val_loss: 0.4636 - val_accuracy: 0.8765 - val_precision: 0.9043 - val_recall: 0.8560 Epoch 31/50 81/81 [==============================] - 25s 315ms/step - loss: 0.4759 - accuracy: 0.8122 - precision: 0.8435 - recall: 0.7962 - val_loss: 0.4272 - val_accuracy: 0.8807 - val_precision: 0.9156 - val_recall: 0.8477 Epoch 32/50 81/81 [==============================] - 25s 312ms/step - loss: 0.4936 - accuracy: 
0.8009 - precision: 0.8276 - recall: 0.7675 - val_loss: 0.4189 - val_accuracy: 0.8930 - val_precision: 0.9207 - val_recall: 0.8601 Epoch 33/50 81/81 [==============================] - 25s 312ms/step - loss: 0.5063 - accuracy: 0.7923 - precision: 0.8183 - recall: 0.7770 - val_loss: 0.4944 - val_accuracy: 0.8724 - val_precision: 0.9206 - val_recall: 0.8107 Epoch 34/50 81/81 [==============================] - 25s 313ms/step - loss: 0.5653 - accuracy: 0.7816 - precision: 0.8109 - recall: 0.7540 - val_loss: 0.4236 - val_accuracy: 0.9012 - val_precision: 0.9361 - val_recall: 0.8436 Epoch 35/50 81/81 [==============================] - 25s 307ms/step - loss: 0.5176 - accuracy: 0.7951 - precision: 0.8282 - recall: 0.7567 - val_loss: 0.3895 - val_accuracy: 0.9095 - val_precision: 0.9214 - val_recall: 0.8683 Epoch 36/50 81/81 [==============================] - 25s 306ms/step - loss: 0.5331 - accuracy: 0.7975 - precision: 0.8195 - recall: 0.7745 - val_loss: 0.4030 - val_accuracy: 0.8889 - val_precision: 0.9163 - val_recall: 0.8560 Epoch 37/50 81/81 [==============================] - 25s 307ms/step - loss: 0.4840 - accuracy: 0.8224 - precision: 0.8361 - recall: 0.7940 - val_loss: 0.4163 - val_accuracy: 0.8889 - val_precision: 0.9189 - val_recall: 0.8395 Epoch 38/50 81/81 [==============================] - 25s 308ms/step - loss: 0.4396 - accuracy: 0.8275 - precision: 0.8447 - recall: 0.8003 - val_loss: 0.3789 - val_accuracy: 0.8971 - val_precision: 0.9254 - val_recall: 0.8683 Epoch 39/50 81/81 [==============================] - 25s 306ms/step - loss: 0.4556 - accuracy: 0.8216 - precision: 0.8304 - recall: 0.7824 - val_loss: 0.3905 - val_accuracy: 0.9053 - val_precision: 0.9292 - val_recall: 0.8642 Epoch 40/50 81/81 [==============================] - 25s 309ms/step - loss: 0.4785 - accuracy: 0.8274 - precision: 0.8433 - recall: 0.7798 - val_loss: 0.4069 - val_accuracy: 0.9053 - val_precision: 0.9136 - val_recall: 0.8272 Epoch 41/50 81/81 [==============================] - 25s 308ms/step - loss: 0.5144 - accuracy: 0.8111 - precision: 0.8244 - recall: 0.7836 - val_loss: 0.4384 - val_accuracy: 0.8642 - val_precision: 0.8947 - val_recall: 0.8395 Epoch 42/50 81/81 [==============================] - 25s 307ms/step - loss: 0.4560 - accuracy: 0.8084 - precision: 0.8454 - recall: 0.7813 - val_loss: 0.3857 - val_accuracy: 0.9053 - val_precision: 0.9251 - val_recall: 0.8642 Epoch 43/50 81/81 [==============================] - 25s 309ms/step - loss: 0.4852 - accuracy: 0.8167 - precision: 0.8349 - recall: 0.7807 - val_loss: 0.4207 - val_accuracy: 0.8601 - val_precision: 0.9037 - val_recall: 0.8107 Epoch 44/50 81/81 [==============================] - 25s 309ms/step - loss: 0.4963 - accuracy: 0.7961 - precision: 0.8196 - recall: 0.7800 - val_loss: 0.4131 - val_accuracy: 0.8889 - val_precision: 0.9414 - val_recall: 0.8601 Epoch 45/50 81/81 [==============================] - 25s 309ms/step - loss: 0.4636 - accuracy: 0.8386 - precision: 0.8601 - recall: 0.8096 - val_loss: 0.4246 - val_accuracy: 0.8765 - val_precision: 0.9324 - val_recall: 0.8519 Epoch 46/50 81/81 [==============================] - 25s 307ms/step - loss: 0.5167 - accuracy: 0.7905 - precision: 0.8044 - recall: 0.7579 - val_loss: 0.3938 - val_accuracy: 0.8889 - val_precision: 0.9220 - val_recall: 0.8272 Epoch 47/50 81/81 [==============================] - 25s 312ms/step - loss: 0.4437 - accuracy: 0.8392 - precision: 0.8564 - recall: 0.7949 - val_loss: 0.3715 - val_accuracy: 0.9218 - val_precision: 0.9505 - val_recall: 0.8683 Epoch 48/50 81/81 
[==============================] - 26s 317ms/step - loss: 0.4923 - accuracy: 0.7610 - precision: 0.8076 - recall: 0.7348 - val_loss: 0.3913 - val_accuracy: 0.9012 - val_precision: 0.9367 - val_recall: 0.8519 Epoch 49/50 81/81 [==============================] - 26s 319ms/step - loss: 0.4444 - accuracy: 0.8176 - precision: 0.8471 - recall: 0.7843 - val_loss: 0.3654 - val_accuracy: 0.9012 - val_precision: 0.9333 - val_recall: 0.8642 Epoch 50/50 81/81 [==============================] - 26s 317ms/step - loss: 0.5040 - accuracy: 0.8030 - precision: 0.8221 - recall: 0.7714 - val_loss: 0.3981 - val_accuracy: 0.8930 - val_precision: 0.9324 - val_recall: 0.8519
model_1.summary()
Model: "sequential" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= batch_normalization (BatchNo (None, None, None, 3) 12 _________________________________________________________________ conv2d (Conv2D) (None, None, None, 16) 448 _________________________________________________________________ max_pooling2d (MaxPooling2D) (None, None, None, 16) 0 _________________________________________________________________ dropout (Dropout) (None, None, None, 16) 0 _________________________________________________________________ batch_normalization_1 (Batch (None, None, None, 16) 64 _________________________________________________________________ conv2d_1 (Conv2D) (None, None, None, 8) 1160 _________________________________________________________________ max_pooling2d_1 (MaxPooling2 (None, None, None, 8) 0 _________________________________________________________________ dropout_1 (Dropout) (None, None, None, 8) 0 _________________________________________________________________ batch_normalization_2 (Batch (None, None, None, 8) 32 _________________________________________________________________ conv2d_2 (Conv2D) (None, None, None, 4) 292 _________________________________________________________________ max_pooling2d_2 (MaxPooling2 (None, None, None, 4) 0 _________________________________________________________________ dropout_2 (Dropout) (None, None, None, 4) 0 _________________________________________________________________ flatten (Flatten) (None, None) 0 _________________________________________________________________ dense (Dense) (None, 5) 25 _________________________________________________________________ dense_1 (Dense) (None, 3) 18 ================================================================= Total params: 2,051 Trainable params: 1,997 Non-trainable params: 54 _________________________________________________________________
plt.subplots_adjust(right=1.95, left=.03)
plt.subplot(1,3,1)
plt.plot(fit_history_1.history['accuracy'])
plt.plot(fit_history_1.history['val_accuracy'])
plt.ylabel('Accuracy')
plt.xlabel('')
plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,2)
plt.plot(fit_history_1.history['precision'])
plt.plot(fit_history_1.history['val_precision'])
plt.ylabel('Precision')
plt.xlabel('Epoch')
#plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,3)
plt.plot(fit_history_1.history['recall'])
plt.plot(fit_history_1.history['val_recall'])
plt.ylabel('Recall')
plt.xlabel('')
#plt.legend(['training','validation'], loc="lower right")
plt.show()
test_loss, test_acc, test_precision, test_recall = model_1.evaluate(testing_data)
print('%s %.2f' % ('test_acc: ', test_acc*100.0 ))
print('%s %.2f' % ('test_loss:', test_loss ))
print('%s %.2f' % ('test_precision:', test_precision ))
print('%s %.2f' % ('test_recall:', test_recall ))
27/27 [==============================] - 6s 208ms/step - loss: 0.5172 - accuracy: 0.8477 - precision: 0.8756 - recall: 0.7819 test_acc: 84.77 test_loss: 0.52 test_precision: 0.88 test_recall: 0.78
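As an aside, the shifting metric suffixes in these logs (precision_6 in one run, bare precision in another) come from Keras auto-numbering each new Precision()/Recall() instance created in the session, which also changes the history keys from model to model. Naming the instances explicitly keeps the keys stable; a sketch:
from tensorflow.keras.metrics import Precision, Recall
# Explicit names give stable history keys ('precision', 'val_precision', ...)
# no matter how many models have been compiled in this session.
model_1.compile( optimizer = 'adam', loss = 'categorical_crossentropy',
                 metrics = [ 'accuracy', Precision(name='precision'), Recall(name='recall') ] )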
Our self-built models topped out at roughly 85-86% accuracy on the test data, with validation accuracy peaking above 90%. A pre-trained network may be able to do better, so we will evaluate existing network architectures (VGG16, DenseNet121, and ResNet50) and compare their performance on our data set.
from tensorflow.keras.applications import VGG16, DenseNet121, ResNet50
We will need to load our data again, this time at the 224x224 image size these pre-trained networks expect by default.
image_generator = ImageDataGenerator(rescale=1./255)
training_data = image_generator.flow_from_directory( 'bears/training', target_size=(224, 224), batch_size=9, class_mode='categorical')
validation_data = image_generator.flow_from_directory( 'bears/validation', target_size=(224, 224), batch_size=9, class_mode='categorical')
testing_data = image_generator.flow_from_directory( 'bears/test', target_size=(224, 224), batch_size=9, class_mode='categorical')
Found 727 images belonging to 3 classes. Found 243 images belonging to 3 classes. Found 243 images belonging to 3 classes.
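A plain 1/255 rescale is used here for simplicity. Each Keras application also ships a matched preprocess_input function (for VGG16, conversion to BGR plus ImageNet mean subtraction), which can sometimes improve transfer results. A sketch of that alternative:
from tensorflow.keras.applications.vgg16 import preprocess_input
# Apply VGG16's own ImageNet preprocessing instead of rescaling by 1/255.
vgg_generator = ImageDataGenerator(preprocessing_function=preprocess_input)
vgg_training_data = vgg_generator.flow_from_directory( 'bears/training', target_size=(224, 224), batch_size=9, class_mode='categorical')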
VGG16
First, let's evaluate the VGG16 network architecture. We'll fine-tune the last convolutional block by unfreezing its weights (keeping all earlier weights frozen), then attach a small fully-connected classifier network to the backend.
vgg_base = VGG16(weights='imagenet', include_top=False, input_shape=(224,224,3))
vgg_base.summary()
Model: "vgg16" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= input_4 (InputLayer) [(None, 224, 224, 3)] 0 _________________________________________________________________ block1_conv1 (Conv2D) (None, 224, 224, 64) 1792 _________________________________________________________________ block1_conv2 (Conv2D) (None, 224, 224, 64) 36928 _________________________________________________________________ block1_pool (MaxPooling2D) (None, 112, 112, 64) 0 _________________________________________________________________ block2_conv1 (Conv2D) (None, 112, 112, 128) 73856 _________________________________________________________________ block2_conv2 (Conv2D) (None, 112, 112, 128) 147584 _________________________________________________________________ block2_pool (MaxPooling2D) (None, 56, 56, 128) 0 _________________________________________________________________ block3_conv1 (Conv2D) (None, 56, 56, 256) 295168 _________________________________________________________________ block3_conv2 (Conv2D) (None, 56, 56, 256) 590080 _________________________________________________________________ block3_conv3 (Conv2D) (None, 56, 56, 256) 590080 _________________________________________________________________ block3_pool (MaxPooling2D) (None, 28, 28, 256) 0 _________________________________________________________________ block4_conv1 (Conv2D) (None, 28, 28, 512) 1180160 _________________________________________________________________ block4_conv2 (Conv2D) (None, 28, 28, 512) 2359808 _________________________________________________________________ block4_conv3 (Conv2D) (None, 28, 28, 512) 2359808 _________________________________________________________________ block4_pool (MaxPooling2D) (None, 14, 14, 512) 0 _________________________________________________________________ block5_conv1 (Conv2D) (None, 14, 14, 512) 2359808 _________________________________________________________________ block5_conv2 (Conv2D) (None, 14, 14, 512) 2359808 _________________________________________________________________ block5_conv3 (Conv2D) (None, 14, 14, 512) 2359808 _________________________________________________________________ block5_pool (MaxPooling2D) (None, 7, 7, 512) 0 ================================================================= Total params: 14,714,688 Trainable params: 14,714,688 Non-trainable params: 0 _________________________________________________________________
import re
# Freeze the entire base, then flag the block5 layers as trainable for
# fine-tuning. (See the note after the model summary below: because the
# container-level trainable flag remains False, these sublayer flags do not
# actually take effect.)
vgg_base.trainable = False
for layer in vgg_base.layers:
    if re.search('block5', layer.name):
        layer.trainable = True
        print(layer.name, ": Trainable")
block5_conv1 : Trainable block5_conv2 : Trainable block5_conv3 : Trainable block5_pool : Trainable
modelV1 = Sequential()
modelV1.add( vgg_base )                                # VGG16 convolutional base
modelV1.add( Flatten() )                               # 7 x 7 x 512 -> 25088 features
modelV1.add( Dense(units=25, activation = 'relu' ) )   # input size is inferred from Flatten
modelV1.add( Dense(units=3, activation = 'softmax' ) ) # one output per bear species
modelV1.summary()
Model: "sequential_4" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= vgg16 (Functional) (None, 7, 7, 512) 14714688 _________________________________________________________________ flatten_4 (Flatten) (None, 25088) 0 _________________________________________________________________ dense_8 (Dense) (None, 25) 627225 _________________________________________________________________ dense_9 (Dense) (None, 3) 78 ================================================================= Total params: 15,341,991 Trainable params: 627,303 Non-trainable params: 14,714,688 _________________________________________________________________
modelV1.compile( optimizer = 'rmsprop', loss = 'categorical_crossentropy', metrics = [ 'accuracy', Precision(), Recall()] )
# As above, the generators' batch size (9) sets the 81 steps per epoch; batch_size has no effect with generator input.
modelV1_history = modelV1.fit( training_data, validation_data = validation_data, epochs = 50, batch_size = 64, callbacks = [callback_earlystp] )
Epoch 1/50 81/81 [==============================] - 28s 335ms/step - loss: 1.9526 - accuracy: 0.6313 - precision_4: 0.6430 - recall_4: 0.6076 - val_loss: 0.2762 - val_accuracy: 0.8765 - val_precision_4: 0.8824 - val_recall_4: 0.8642 Epoch 2/50 81/81 [==============================] - 26s 326ms/step - loss: 0.2212 - accuracy: 0.9343 - precision_4: 0.9343 - recall_4: 0.9340 - val_loss: 0.2191 - val_accuracy: 0.9383 - val_precision_4: 0.9461 - val_recall_4: 0.9383 Epoch 3/50 81/81 [==============================] - 27s 329ms/step - loss: 0.1944 - accuracy: 0.9345 - precision_4: 0.9366 - recall_4: 0.9345 - val_loss: 0.2023 - val_accuracy: 0.9383 - val_precision_4: 0.9419 - val_recall_4: 0.9342 Epoch 4/50 81/81 [==============================] - 27s 334ms/step - loss: 0.0879 - accuracy: 0.9747 - precision_4: 0.9747 - recall_4: 0.9747 - val_loss: 0.2158 - val_accuracy: 0.9588 - val_precision_4: 0.9585 - val_recall_4: 0.9506 Epoch 5/50 81/81 [==============================] - 26s 323ms/step - loss: 0.0424 - accuracy: 0.9919 - precision_4: 0.9919 - recall_4: 0.9919 - val_loss: 2.1298 - val_accuracy: 0.7037 - val_precision_4: 0.7037 - val_recall_4: 0.7037 Epoch 6/50 81/81 [==============================] - 26s 326ms/step - loss: 0.1636 - accuracy: 0.9784 - precision_4: 0.9784 - recall_4: 0.9784 - val_loss: 0.2059 - val_accuracy: 0.9547 - val_precision_4: 0.9587 - val_recall_4: 0.9547 Epoch 7/50 81/81 [==============================] - 26s 321ms/step - loss: 0.0158 - accuracy: 0.9972 - precision_4: 0.9972 - recall_4: 0.9972 - val_loss: 0.3082 - val_accuracy: 0.9424 - val_precision_4: 0.9424 - val_recall_4: 0.9424 Epoch 8/50 81/81 [==============================] - 26s 325ms/step - loss: 0.0080 - accuracy: 0.9975 - precision_4: 0.9975 - recall_4: 0.9975 - val_loss: 0.2897 - val_accuracy: 0.9465 - val_precision_4: 0.9465 - val_recall_4: 0.9465 Epoch 9/50 81/81 [==============================] - 26s 325ms/step - loss: 0.0178 - accuracy: 0.9950 - precision_4: 0.9950 - recall_4: 0.9950 - val_loss: 0.2546 - val_accuracy: 0.9506 - val_precision_4: 0.9506 - val_recall_4: 0.9506 Epoch 10/50 81/81 [==============================] - 26s 327ms/step - loss: 0.0171 - accuracy: 0.9958 - precision_4: 0.9958 - recall_4: 0.9958 - val_loss: 0.2639 - val_accuracy: 0.9630 - val_precision_4: 0.9630 - val_recall_4: 0.9630 Epoch 11/50 81/81 [==============================] - 26s 323ms/step - loss: 8.7138e-05 - accuracy: 1.0000 - precision_4: 1.0000 - recall_4: 1.0000 - val_loss: 0.2558 - val_accuracy: 0.9547 - val_precision_4: 0.9547 - val_recall_4: 0.9547 Epoch 12/50 81/81 [==============================] - 27s 327ms/step - loss: 0.0221 - accuracy: 0.9950 - precision_4: 0.9950 - recall_4: 0.9950 - val_loss: 0.2647 - val_accuracy: 0.9547 - val_precision_4: 0.9545 - val_recall_4: 0.9506 Epoch 13/50 81/81 [==============================] - 26s 328ms/step - loss: 1.1333e-05 - accuracy: 1.0000 - precision_4: 1.0000 - recall_4: 1.0000 - val_loss: 0.2967 - val_accuracy: 0.9547 - val_precision_4: 0.9587 - val_recall_4: 0.9547 Epoch 14/50 81/81 [==============================] - 27s 330ms/step - loss: 0.0041 - accuracy: 0.9990 - precision_4: 0.9990 - recall_4: 0.9990 - val_loss: 0.3014 - val_accuracy: 0.9547 - val_precision_4: 0.9547 - val_recall_4: 0.9547 Epoch 15/50 81/81 [==============================] - 27s 337ms/step - loss: 4.9826e-06 - accuracy: 1.0000 - precision_4: 1.0000 - recall_4: 1.0000 - val_loss: 0.2961 - val_accuracy: 0.9588 - val_precision_4: 0.9588 - val_recall_4: 0.9588 Epoch 16/50 81/81 
[==============================] - 27s 336ms/step - loss: 0.0129 - accuracy: 0.9977 - precision_4: 0.9977 - recall_4: 0.9977 - val_loss: 0.4053 - val_accuracy: 0.9547 - val_precision_4: 0.9547 - val_recall_4: 0.9547 Epoch 17/50 81/81 [==============================] - 27s 332ms/step - loss: 2.1034e-06 - accuracy: 1.0000 - precision_4: 1.0000 - recall_4: 1.0000 - val_loss: 0.3261 - val_accuracy: 0.9547 - val_precision_4: 0.9587 - val_recall_4: 0.9547 Epoch 18/50 81/81 [==============================] - 27s 331ms/step - loss: 5.7038e-05 - accuracy: 1.0000 - precision_4: 1.0000 - recall_4: 1.0000 - val_loss: 0.5040 - val_accuracy: 0.9300 - val_precision_4: 0.9298 - val_recall_4: 0.9259 Epoch 00018: early stopping
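The "Epoch 00018: early stopping" message is produced by the callback_earlystp callback defined earlier in the notebook. For reference, a minimal sketch of such a callback (the monitored quantity and patience shown here are illustrative placeholders, not necessarily the values used in this run):
from tensorflow.keras.callbacks import EarlyStopping
# verbose=1 prints the 'Epoch 000NN: early stopping' message seen above;
# monitor and patience below are placeholder choices.
callback_earlystp = EarlyStopping(monitor='val_loss', patience=5, verbose=1)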
plt.subplots_adjust(right=1.95, left=.03)
plt.subplot(1,3,1)
plt.plot(modelV1_history.history['accuracy'])
plt.plot(modelV1_history.history['val_accuracy'])
plt.ylabel('Accuracy')
plt.xlabel('')
plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,2)
plt.plot(modelV1_history.history['precision_4'])
plt.plot(modelV1_history.history['val_precision_4'])
plt.ylabel('Precision')
plt.xlabel('Epoch')
plt.subplot(1,3,3)
plt.plot(modelV1_history.history['recall_4'])
plt.plot(modelV1_history.history['val_recall_4'])
plt.ylabel('Recall')
plt.xlabel('')
#plt.subplot(2,2,4)
#plt.plot(modelV1_history.history['loss'])
#plt.plot(modelV1_history.history['val_loss'])
#plt.ylabel('Loss')
#plt.xlabel('Epoch')
plt.show()
test_loss, test_acc, test_precision, test_recall = modelV1.evaluate(testing_data)
print('%s %.2f' % ('test_acc: ', test_acc*100.0 ))
print('%s %.2f' % ('test_loss:', test_loss ))
print('%s %.2f' % ('test_precision:', test_precision ))
print('%s %.2f' % ('test_recall:', test_recall ))
27/27 [==============================] - 6s 225ms/step - loss: 0.3475 - accuracy: 0.9300 - precision_4: 0.9300 - recall_4: 0.9300 test_acc: 93.00 test_loss: 0.35 test_precision: 0.93 test_recall: 0.93
DenseNet121
We will now try a more recent architecture, DenseNet121, chosen because it performs comparably to the other DenseNet variants available in Keras while having the fewest parameters. As with VGG16, we will unfreeze the weights of the last convolutional block, attach a similar fully-connected head, and evaluate the performance.
dense_base = DenseNet121(weights='imagenet', include_top=False, input_shape=(224,224,3))
dense_base.summary()
Model: "densenet121"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_3 (InputLayer) [(None, 224, 224, 3) 0
__________________________________________________________________________________________________
zero_padding2d_2 (ZeroPadding2D (None, 230, 230, 3) 0 input_3[0][0]
__________________________________________________________________________________________________
conv1/conv (Conv2D) (None, 112, 112, 64) 9408 zero_padding2d_2[0][0]
__________________________________________________________________________________________________
conv1/bn (BatchNormalization) (None, 112, 112, 64) 256 conv1/conv[0][0]
__________________________________________________________________________________________________
conv1/relu (Activation) (None, 112, 112, 64) 0 conv1/bn[0][0]
__________________________________________________________________________________________________
zero_padding2d_3 (ZeroPadding2D (None, 114, 114, 64) 0 conv1/relu[0][0]
__________________________________________________________________________________________________
pool1 (MaxPooling2D) (None, 56, 56, 64) 0 zero_padding2d_3[0][0]
__________________________________________________________________________________________________
conv2_block1_0_bn (BatchNormali (None, 56, 56, 64) 256 pool1[0][0]
__________________________________________________________________________________________________
conv2_block1_0_relu (Activation (None, 56, 56, 64) 0 conv2_block1_0_bn[0][0]
__________________________________________________________________________________________________
conv2_block1_1_conv (Conv2D) (None, 56, 56, 128) 8192 conv2_block1_0_relu[0][0]
__________________________________________________________________________________________________
conv2_block1_1_bn (BatchNormali (None, 56, 56, 128) 512 conv2_block1_1_conv[0][0]
__________________________________________________________________________________________________
conv2_block1_1_relu (Activation (None, 56, 56, 128) 0 conv2_block1_1_bn[0][0]
__________________________________________________________________________________________________
conv2_block1_2_conv (Conv2D) (None, 56, 56, 32) 36864 conv2_block1_1_relu[0][0]
__________________________________________________________________________________________________
conv2_block1_concat (Concatenat (None, 56, 56, 96) 0 pool1[0][0]
conv2_block1_2_conv[0][0]
__________________________________________________________________________________________________
conv2_block2_0_bn (BatchNormali (None, 56, 56, 96) 384 conv2_block1_concat[0][0]
__________________________________________________________________________________________________
conv2_block2_0_relu (Activation (None, 56, 56, 96) 0 conv2_block2_0_bn[0][0]
__________________________________________________________________________________________________
conv2_block2_1_conv (Conv2D) (None, 56, 56, 128) 12288 conv2_block2_0_relu[0][0]
__________________________________________________________________________________________________
conv2_block2_1_bn (BatchNormali (None, 56, 56, 128) 512 conv2_block2_1_conv[0][0]
__________________________________________________________________________________________________
conv2_block2_1_relu (Activation (None, 56, 56, 128) 0 conv2_block2_1_bn[0][0]
__________________________________________________________________________________________________
conv2_block2_2_conv (Conv2D) (None, 56, 56, 32) 36864 conv2_block2_1_relu[0][0]
__________________________________________________________________________________________________
conv2_block2_concat (Concatenat (None, 56, 56, 128) 0 conv2_block1_concat[0][0]
conv2_block2_2_conv[0][0]
__________________________________________________________________________________________________
conv2_block3_0_bn (BatchNormali (None, 56, 56, 128) 512 conv2_block2_concat[0][0]
__________________________________________________________________________________________________
conv2_block3_0_relu (Activation (None, 56, 56, 128) 0 conv2_block3_0_bn[0][0]
__________________________________________________________________________________________________
conv2_block3_1_conv (Conv2D) (None, 56, 56, 128) 16384 conv2_block3_0_relu[0][0]
__________________________________________________________________________________________________
conv2_block3_1_bn (BatchNormali (None, 56, 56, 128) 512 conv2_block3_1_conv[0][0]
__________________________________________________________________________________________________
conv2_block3_1_relu (Activation (None, 56, 56, 128) 0 conv2_block3_1_bn[0][0]
__________________________________________________________________________________________________
conv2_block3_2_conv (Conv2D) (None, 56, 56, 32) 36864 conv2_block3_1_relu[0][0]
__________________________________________________________________________________________________
conv2_block3_concat (Concatenat (None, 56, 56, 160) 0 conv2_block2_concat[0][0]
conv2_block3_2_conv[0][0]
__________________________________________________________________________________________________
conv2_block4_0_bn (BatchNormali (None, 56, 56, 160) 640 conv2_block3_concat[0][0]
__________________________________________________________________________________________________
conv2_block4_0_relu (Activation (None, 56, 56, 160) 0 conv2_block4_0_bn[0][0]
__________________________________________________________________________________________________
conv2_block4_1_conv (Conv2D) (None, 56, 56, 128) 20480 conv2_block4_0_relu[0][0]
__________________________________________________________________________________________________
conv2_block4_1_bn (BatchNormali (None, 56, 56, 128) 512 conv2_block4_1_conv[0][0]
__________________________________________________________________________________________________
conv2_block4_1_relu (Activation (None, 56, 56, 128) 0 conv2_block4_1_bn[0][0]
__________________________________________________________________________________________________
conv2_block4_2_conv (Conv2D) (None, 56, 56, 32) 36864 conv2_block4_1_relu[0][0]
__________________________________________________________________________________________________
conv2_block4_concat (Concatenat (None, 56, 56, 192) 0 conv2_block3_concat[0][0]
conv2_block4_2_conv[0][0]
__________________________________________________________________________________________________
conv2_block5_0_bn (BatchNormali (None, 56, 56, 192) 768 conv2_block4_concat[0][0]
__________________________________________________________________________________________________
conv2_block5_0_relu (Activation (None, 56, 56, 192) 0 conv2_block5_0_bn[0][0]
__________________________________________________________________________________________________
conv2_block5_1_conv (Conv2D) (None, 56, 56, 128) 24576 conv2_block5_0_relu[0][0]
__________________________________________________________________________________________________
conv2_block5_1_bn (BatchNormali (None, 56, 56, 128) 512 conv2_block5_1_conv[0][0]
__________________________________________________________________________________________________
conv2_block5_1_relu (Activation (None, 56, 56, 128) 0 conv2_block5_1_bn[0][0]
__________________________________________________________________________________________________
conv2_block5_2_conv (Conv2D) (None, 56, 56, 32) 36864 conv2_block5_1_relu[0][0]
__________________________________________________________________________________________________
conv2_block5_concat (Concatenat (None, 56, 56, 224) 0 conv2_block4_concat[0][0]
conv2_block5_2_conv[0][0]
__________________________________________________________________________________________________
conv2_block6_0_bn (BatchNormali (None, 56, 56, 224) 896 conv2_block5_concat[0][0]
__________________________________________________________________________________________________
conv2_block6_0_relu (Activation (None, 56, 56, 224) 0 conv2_block6_0_bn[0][0]
__________________________________________________________________________________________________
conv2_block6_1_conv (Conv2D) (None, 56, 56, 128) 28672 conv2_block6_0_relu[0][0]
__________________________________________________________________________________________________
conv2_block6_1_bn (BatchNormali (None, 56, 56, 128) 512 conv2_block6_1_conv[0][0]
__________________________________________________________________________________________________
conv2_block6_1_relu (Activation (None, 56, 56, 128) 0 conv2_block6_1_bn[0][0]
__________________________________________________________________________________________________
conv2_block6_2_conv (Conv2D) (None, 56, 56, 32) 36864 conv2_block6_1_relu[0][0]
__________________________________________________________________________________________________
conv2_block6_concat (Concatenat (None, 56, 56, 256) 0 conv2_block5_concat[0][0]
conv2_block6_2_conv[0][0]
__________________________________________________________________________________________________
pool2_bn (BatchNormalization) (None, 56, 56, 256) 1024 conv2_block6_concat[0][0]
__________________________________________________________________________________________________
pool2_relu (Activation) (None, 56, 56, 256) 0 pool2_bn[0][0]
__________________________________________________________________________________________________
pool2_conv (Conv2D) (None, 56, 56, 128) 32768 pool2_relu[0][0]
__________________________________________________________________________________________________
pool2_pool (AveragePooling2D) (None, 28, 28, 128) 0 pool2_conv[0][0]
__________________________________________________________________________________________________
conv3_block1_0_bn (BatchNormali (None, 28, 28, 128) 512 pool2_pool[0][0]
__________________________________________________________________________________________________
conv3_block1_0_relu (Activation (None, 28, 28, 128) 0 conv3_block1_0_bn[0][0]
__________________________________________________________________________________________________
conv3_block1_1_conv (Conv2D) (None, 28, 28, 128) 16384 conv3_block1_0_relu[0][0]
__________________________________________________________________________________________________
conv3_block1_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block1_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block1_1_relu (Activation (None, 28, 28, 128) 0 conv3_block1_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block1_2_conv (Conv2D) (None, 28, 28, 32) 36864 conv3_block1_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block1_concat (Concatenat (None, 28, 28, 160) 0 pool2_pool[0][0]
conv3_block1_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block2_0_bn (BatchNormali (None, 28, 28, 160) 640 conv3_block1_concat[0][0]
__________________________________________________________________________________________________
conv3_block2_0_relu (Activation (None, 28, 28, 160) 0 conv3_block2_0_bn[0][0]
__________________________________________________________________________________________________
conv3_block2_1_conv (Conv2D) (None, 28, 28, 128) 20480 conv3_block2_0_relu[0][0]
__________________________________________________________________________________________________
conv3_block2_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block2_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block2_1_relu (Activation (None, 28, 28, 128) 0 conv3_block2_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block2_2_conv (Conv2D) (None, 28, 28, 32) 36864 conv3_block2_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block2_concat (Concatenat (None, 28, 28, 192) 0 conv3_block1_concat[0][0]
conv3_block2_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block3_0_bn (BatchNormali (None, 28, 28, 192) 768 conv3_block2_concat[0][0]
__________________________________________________________________________________________________
conv3_block3_0_relu (Activation (None, 28, 28, 192) 0 conv3_block3_0_bn[0][0]
__________________________________________________________________________________________________
conv3_block3_1_conv (Conv2D) (None, 28, 28, 128) 24576 conv3_block3_0_relu[0][0]
__________________________________________________________________________________________________
conv3_block3_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block3_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block3_1_relu (Activation (None, 28, 28, 128) 0 conv3_block3_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block3_2_conv (Conv2D) (None, 28, 28, 32) 36864 conv3_block3_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block3_concat (Concatenat (None, 28, 28, 224) 0 conv3_block2_concat[0][0]
conv3_block3_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block4_0_bn (BatchNormali (None, 28, 28, 224) 896 conv3_block3_concat[0][0]
__________________________________________________________________________________________________
conv3_block4_0_relu (Activation (None, 28, 28, 224) 0 conv3_block4_0_bn[0][0]
__________________________________________________________________________________________________
conv3_block4_1_conv (Conv2D) (None, 28, 28, 128) 28672 conv3_block4_0_relu[0][0]
__________________________________________________________________________________________________
conv3_block4_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block4_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block4_1_relu (Activation (None, 28, 28, 128) 0 conv3_block4_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block4_2_conv (Conv2D) (None, 28, 28, 32) 36864 conv3_block4_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block4_concat (Concatenat (None, 28, 28, 256) 0 conv3_block3_concat[0][0]
conv3_block4_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block5_0_bn (BatchNormali (None, 28, 28, 256) 1024 conv3_block4_concat[0][0]
__________________________________________________________________________________________________
conv3_block5_0_relu (Activation (None, 28, 28, 256) 0 conv3_block5_0_bn[0][0]
__________________________________________________________________________________________________
conv3_block5_1_conv (Conv2D) (None, 28, 28, 128) 32768 conv3_block5_0_relu[0][0]
__________________________________________________________________________________________________
conv3_block5_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block5_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block5_1_relu (Activation (None, 28, 28, 128) 0 conv3_block5_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block5_2_conv (Conv2D) (None, 28, 28, 32) 36864 conv3_block5_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block5_concat (Concatenat (None, 28, 28, 288) 0 conv3_block4_concat[0][0]
conv3_block5_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block6_0_bn (BatchNormali (None, 28, 28, 288) 1152 conv3_block5_concat[0][0]
__________________________________________________________________________________________________
conv3_block6_0_relu (Activation (None, 28, 28, 288) 0 conv3_block6_0_bn[0][0]
__________________________________________________________________________________________________
conv3_block6_1_conv (Conv2D) (None, 28, 28, 128) 36864 conv3_block6_0_relu[0][0]
__________________________________________________________________________________________________
conv3_block6_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block6_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block6_1_relu (Activation (None, 28, 28, 128) 0 conv3_block6_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block6_2_conv (Conv2D) (None, 28, 28, 32) 36864 conv3_block6_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block6_concat (Concatenat (None, 28, 28, 320) 0 conv3_block5_concat[0][0]
conv3_block6_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block7_0_bn (BatchNormali (None, 28, 28, 320) 1280 conv3_block6_concat[0][0]
__________________________________________________________________________________________________
conv3_block7_0_relu (Activation (None, 28, 28, 320) 0 conv3_block7_0_bn[0][0]
__________________________________________________________________________________________________
conv3_block7_1_conv (Conv2D) (None, 28, 28, 128) 40960 conv3_block7_0_relu[0][0]
__________________________________________________________________________________________________
conv3_block7_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block7_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block7_1_relu (Activation (None, 28, 28, 128) 0 conv3_block7_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block7_2_conv (Conv2D) (None, 28, 28, 32) 36864 conv3_block7_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block7_concat (Concatenat (None, 28, 28, 352) 0 conv3_block6_concat[0][0]
conv3_block7_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block8_0_bn (BatchNormali (None, 28, 28, 352) 1408 conv3_block7_concat[0][0]
__________________________________________________________________________________________________
conv3_block8_0_relu (Activation (None, 28, 28, 352) 0 conv3_block8_0_bn[0][0]
__________________________________________________________________________________________________
conv3_block8_1_conv (Conv2D) (None, 28, 28, 128) 45056 conv3_block8_0_relu[0][0]
__________________________________________________________________________________________________
conv3_block8_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block8_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block8_1_relu (Activation (None, 28, 28, 128) 0 conv3_block8_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block8_2_conv (Conv2D) (None, 28, 28, 32) 36864 conv3_block8_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block8_concat (Concatenat (None, 28, 28, 384) 0 conv3_block7_concat[0][0]
conv3_block8_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block9_0_bn (BatchNormali (None, 28, 28, 384) 1536 conv3_block8_concat[0][0]
__________________________________________________________________________________________________
conv3_block9_0_relu (Activation (None, 28, 28, 384) 0 conv3_block9_0_bn[0][0]
__________________________________________________________________________________________________
conv3_block9_1_conv (Conv2D) (None, 28, 28, 128) 49152 conv3_block9_0_relu[0][0]
__________________________________________________________________________________________________
conv3_block9_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block9_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block9_1_relu (Activation (None, 28, 28, 128) 0 conv3_block9_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block9_2_conv (Conv2D) (None, 28, 28, 32) 36864 conv3_block9_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block9_concat (Concatenat (None, 28, 28, 416) 0 conv3_block8_concat[0][0]
conv3_block9_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block10_0_bn (BatchNormal (None, 28, 28, 416) 1664 conv3_block9_concat[0][0]
__________________________________________________________________________________________________
conv3_block10_0_relu (Activatio (None, 28, 28, 416) 0 conv3_block10_0_bn[0][0]
__________________________________________________________________________________________________
conv3_block10_1_conv (Conv2D) (None, 28, 28, 128) 53248 conv3_block10_0_relu[0][0]
__________________________________________________________________________________________________
conv3_block10_1_bn (BatchNormal (None, 28, 28, 128) 512 conv3_block10_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block10_1_relu (Activatio (None, 28, 28, 128) 0 conv3_block10_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block10_2_conv (Conv2D) (None, 28, 28, 32) 36864 conv3_block10_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block10_concat (Concatena (None, 28, 28, 448) 0 conv3_block9_concat[0][0]
conv3_block10_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block11_0_bn (BatchNormal (None, 28, 28, 448) 1792 conv3_block10_concat[0][0]
__________________________________________________________________________________________________
conv3_block11_0_relu (Activatio (None, 28, 28, 448) 0 conv3_block11_0_bn[0][0]
__________________________________________________________________________________________________
conv3_block11_1_conv (Conv2D) (None, 28, 28, 128) 57344 conv3_block11_0_relu[0][0]
__________________________________________________________________________________________________
conv3_block11_1_bn (BatchNormal (None, 28, 28, 128) 512 conv3_block11_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block11_1_relu (Activatio (None, 28, 28, 128) 0 conv3_block11_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block11_2_conv (Conv2D) (None, 28, 28, 32) 36864 conv3_block11_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block11_concat (Concatena (None, 28, 28, 480) 0 conv3_block10_concat[0][0]
conv3_block11_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block12_0_bn (BatchNormal (None, 28, 28, 480) 1920 conv3_block11_concat[0][0]
__________________________________________________________________________________________________
conv3_block12_0_relu (Activatio (None, 28, 28, 480) 0 conv3_block12_0_bn[0][0]
__________________________________________________________________________________________________
conv3_block12_1_conv (Conv2D) (None, 28, 28, 128) 61440 conv3_block12_0_relu[0][0]
__________________________________________________________________________________________________
conv3_block12_1_bn (BatchNormal (None, 28, 28, 128) 512 conv3_block12_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block12_1_relu (Activatio (None, 28, 28, 128) 0 conv3_block12_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block12_2_conv (Conv2D) (None, 28, 28, 32) 36864 conv3_block12_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block12_concat (Concatena (None, 28, 28, 512) 0 conv3_block11_concat[0][0]
conv3_block12_2_conv[0][0]
__________________________________________________________________________________________________
pool3_bn (BatchNormalization) (None, 28, 28, 512) 2048 conv3_block12_concat[0][0]
__________________________________________________________________________________________________
pool3_relu (Activation) (None, 28, 28, 512) 0 pool3_bn[0][0]
__________________________________________________________________________________________________
pool3_conv (Conv2D) (None, 28, 28, 256) 131072 pool3_relu[0][0]
__________________________________________________________________________________________________
pool3_pool (AveragePooling2D) (None, 14, 14, 256) 0 pool3_conv[0][0]
__________________________________________________________________________________________________
conv4_block1_0_bn (BatchNormali (None, 14, 14, 256) 1024 pool3_pool[0][0]
__________________________________________________________________________________________________
conv4_block1_0_relu (Activation (None, 14, 14, 256) 0 conv4_block1_0_bn[0][0]
__________________________________________________________________________________________________
conv4_block1_1_conv (Conv2D) (None, 14, 14, 128) 32768 conv4_block1_0_relu[0][0]
__________________________________________________________________________________________________
conv4_block1_1_bn (BatchNormali (None, 14, 14, 128) 512 conv4_block1_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block1_1_relu (Activation (None, 14, 14, 128) 0 conv4_block1_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block1_2_conv (Conv2D) (None, 14, 14, 32) 36864 conv4_block1_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block1_concat (Concatenat (None, 14, 14, 288) 0 pool3_pool[0][0]
conv4_block1_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block2_0_bn (BatchNormali (None, 14, 14, 288) 1152 conv4_block1_concat[0][0]
__________________________________________________________________________________________________
conv4_block2_0_relu (Activation (None, 14, 14, 288) 0 conv4_block2_0_bn[0][0]
__________________________________________________________________________________________________
conv4_block2_1_conv (Conv2D) (None, 14, 14, 128) 36864 conv4_block2_0_relu[0][0]
__________________________________________________________________________________________________
conv4_block2_1_bn (BatchNormali (None, 14, 14, 128) 512 conv4_block2_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block2_1_relu (Activation (None, 14, 14, 128) 0 conv4_block2_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block2_2_conv (Conv2D) (None, 14, 14, 32) 36864 conv4_block2_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block2_concat (Concatenat (None, 14, 14, 320) 0 conv4_block1_concat[0][0]
conv4_block2_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block3_0_bn (BatchNormali (None, 14, 14, 320) 1280 conv4_block2_concat[0][0]
__________________________________________________________________________________________________
conv4_block3_0_relu (Activation (None, 14, 14, 320) 0 conv4_block3_0_bn[0][0]
__________________________________________________________________________________________________
conv4_block3_1_conv (Conv2D) (None, 14, 14, 128) 40960 conv4_block3_0_relu[0][0]
__________________________________________________________________________________________________
conv4_block3_1_bn (BatchNormali (None, 14, 14, 128) 512 conv4_block3_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block3_1_relu (Activation (None, 14, 14, 128) 0 conv4_block3_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block3_2_conv (Conv2D) (None, 14, 14, 32) 36864 conv4_block3_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block3_concat (Concatenat (None, 14, 14, 352) 0 conv4_block2_concat[0][0]
conv4_block3_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block4_0_bn (BatchNormali (None, 14, 14, 352) 1408 conv4_block3_concat[0][0]
__________________________________________________________________________________________________
conv4_block4_0_relu (Activation (None, 14, 14, 352) 0 conv4_block4_0_bn[0][0]
__________________________________________________________________________________________________
conv4_block4_1_conv (Conv2D) (None, 14, 14, 128) 45056 conv4_block4_0_relu[0][0]
__________________________________________________________________________________________________
conv4_block4_1_bn (BatchNormali (None, 14, 14, 128) 512 conv4_block4_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block4_1_relu (Activation (None, 14, 14, 128) 0 conv4_block4_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block4_2_conv (Conv2D) (None, 14, 14, 32) 36864 conv4_block4_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block4_concat (Concatenat (None, 14, 14, 384) 0 conv4_block3_concat[0][0]
conv4_block4_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block5_0_bn (BatchNormali (None, 14, 14, 384) 1536 conv4_block4_concat[0][0]
__________________________________________________________________________________________________
conv4_block5_0_relu (Activation (None, 14, 14, 384) 0 conv4_block5_0_bn[0][0]
__________________________________________________________________________________________________
conv4_block5_1_conv (Conv2D) (None, 14, 14, 128) 49152 conv4_block5_0_relu[0][0]
__________________________________________________________________________________________________
conv4_block5_1_bn (BatchNormali (None, 14, 14, 128) 512 conv4_block5_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block5_1_relu (Activation (None, 14, 14, 128) 0 conv4_block5_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block5_2_conv (Conv2D) (None, 14, 14, 32) 36864 conv4_block5_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block5_concat (Concatenat (None, 14, 14, 416) 0 conv4_block4_concat[0][0]
conv4_block5_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block6_0_bn (BatchNormali (None, 14, 14, 416) 1664 conv4_block5_concat[0][0]
__________________________________________________________________________________________________
conv4_block6_0_relu (Activation (None, 14, 14, 416) 0 conv4_block6_0_bn[0][0]
__________________________________________________________________________________________________
conv4_block6_1_conv (Conv2D) (None, 14, 14, 128) 53248 conv4_block6_0_relu[0][0]
__________________________________________________________________________________________________
conv4_block6_1_bn (BatchNormali (None, 14, 14, 128) 512 conv4_block6_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block6_1_relu (Activation (None, 14, 14, 128) 0 conv4_block6_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block6_2_conv (Conv2D) (None, 14, 14, 32) 36864 conv4_block6_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block6_concat (Concatenat (None, 14, 14, 448) 0 conv4_block5_concat[0][0]
conv4_block6_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block7_0_bn (BatchNormali (None, 14, 14, 448) 1792 conv4_block6_concat[0][0]
__________________________________________________________________________________________________
conv4_block7_0_relu (Activation (None, 14, 14, 448) 0 conv4_block7_0_bn[0][0]
__________________________________________________________________________________________________
conv4_block7_1_conv (Conv2D) (None, 14, 14, 128) 57344 conv4_block7_0_relu[0][0]
__________________________________________________________________________________________________
conv4_block7_1_bn (BatchNormali (None, 14, 14, 128) 512 conv4_block7_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block7_1_relu (Activation (None, 14, 14, 128) 0 conv4_block7_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block7_2_conv (Conv2D) (None, 14, 14, 32) 36864 conv4_block7_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block7_concat (Concatenat (None, 14, 14, 480) 0 conv4_block6_concat[0][0]
conv4_block7_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block8_0_bn (BatchNormali (None, 14, 14, 480) 1920 conv4_block7_concat[0][0]
__________________________________________________________________________________________________
conv4_block8_0_relu (Activation (None, 14, 14, 480) 0 conv4_block8_0_bn[0][0]
__________________________________________________________________________________________________
conv4_block8_1_conv (Conv2D) (None, 14, 14, 128) 61440 conv4_block8_0_relu[0][0]
__________________________________________________________________________________________________
conv4_block8_1_bn (BatchNormali (None, 14, 14, 128) 512 conv4_block8_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block8_1_relu (Activation (None, 14, 14, 128) 0 conv4_block8_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block8_2_conv (Conv2D) (None, 14, 14, 32) 36864 conv4_block8_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block8_concat (Concatenat (None, 14, 14, 512) 0 conv4_block7_concat[0][0]
conv4_block8_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block9_0_bn (BatchNormali (None, 14, 14, 512) 2048 conv4_block8_concat[0][0]
__________________________________________________________________________________________________
conv4_block9_0_relu (Activation (None, 14, 14, 512) 0 conv4_block9_0_bn[0][0]
__________________________________________________________________________________________________
conv4_block9_1_conv (Conv2D) (None, 14, 14, 128) 65536 conv4_block9_0_relu[0][0]
__________________________________________________________________________________________________
conv4_block9_1_bn (BatchNormali (None, 14, 14, 128) 512 conv4_block9_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block9_1_relu (Activation (None, 14, 14, 128) 0 conv4_block9_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block9_2_conv (Conv2D) (None, 14, 14, 32) 36864 conv4_block9_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block9_concat (Concatenat (None, 14, 14, 544) 0 conv4_block8_concat[0][0]
conv4_block9_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block10_0_bn (BatchNormal (None, 14, 14, 544) 2176 conv4_block9_concat[0][0]
__________________________________________________________________________________________________
conv4_block10_0_relu (Activatio (None, 14, 14, 544) 0 conv4_block10_0_bn[0][0]
__________________________________________________________________________________________________
conv4_block10_1_conv (Conv2D) (None, 14, 14, 128) 69632 conv4_block10_0_relu[0][0]
__________________________________________________________________________________________________
conv4_block10_1_bn (BatchNormal (None, 14, 14, 128) 512 conv4_block10_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block10_1_relu (Activatio (None, 14, 14, 128) 0 conv4_block10_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block10_2_conv (Conv2D) (None, 14, 14, 32) 36864 conv4_block10_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block10_concat (Concatena (None, 14, 14, 576) 0 conv4_block9_concat[0][0]
conv4_block10_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block11_0_bn (BatchNormal (None, 14, 14, 576) 2304 conv4_block10_concat[0][0]
__________________________________________________________________________________________________
conv4_block11_0_relu (Activatio (None, 14, 14, 576) 0 conv4_block11_0_bn[0][0]
__________________________________________________________________________________________________
conv4_block11_1_conv (Conv2D) (None, 14, 14, 128) 73728 conv4_block11_0_relu[0][0]
__________________________________________________________________________________________________
conv4_block11_1_bn (BatchNormal (None, 14, 14, 128) 512 conv4_block11_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block11_1_relu (Activatio (None, 14, 14, 128) 0 conv4_block11_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block11_2_conv (Conv2D) (None, 14, 14, 32) 36864 conv4_block11_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block11_concat (Concatena (None, 14, 14, 608) 0 conv4_block10_concat[0][0]
conv4_block11_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block12_0_bn (BatchNormal (None, 14, 14, 608) 2432 conv4_block11_concat[0][0]
__________________________________________________________________________________________________
conv4_block12_0_relu (Activatio (None, 14, 14, 608) 0 conv4_block12_0_bn[0][0]
__________________________________________________________________________________________________
conv4_block12_1_conv (Conv2D) (None, 14, 14, 128) 77824 conv4_block12_0_relu[0][0]
__________________________________________________________________________________________________
conv4_block12_1_bn (BatchNormal (None, 14, 14, 128) 512 conv4_block12_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block12_1_relu (Activatio (None, 14, 14, 128) 0 conv4_block12_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block12_2_conv (Conv2D) (None, 14, 14, 32) 36864 conv4_block12_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block12_concat (Concatena (None, 14, 14, 640) 0 conv4_block11_concat[0][0]
conv4_block12_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block13_0_bn (BatchNormal (None, 14, 14, 640) 2560 conv4_block12_concat[0][0]
__________________________________________________________________________________________________
conv4_block13_0_relu (Activatio (None, 14, 14, 640) 0 conv4_block13_0_bn[0][0]
__________________________________________________________________________________________________
conv4_block13_1_conv (Conv2D) (None, 14, 14, 128) 81920 conv4_block13_0_relu[0][0]
__________________________________________________________________________________________________
conv4_block13_1_bn (BatchNormal (None, 14, 14, 128) 512 conv4_block13_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block13_1_relu (Activatio (None, 14, 14, 128) 0 conv4_block13_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block13_2_conv (Conv2D) (None, 14, 14, 32) 36864 conv4_block13_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block13_concat (Concatena (None, 14, 14, 672) 0 conv4_block12_concat[0][0]
conv4_block13_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block14_0_bn (BatchNormal (None, 14, 14, 672) 2688 conv4_block13_concat[0][0]
__________________________________________________________________________________________________
conv4_block14_0_relu (Activatio (None, 14, 14, 672) 0 conv4_block14_0_bn[0][0]
__________________________________________________________________________________________________
conv4_block14_1_conv (Conv2D) (None, 14, 14, 128) 86016 conv4_block14_0_relu[0][0]
__________________________________________________________________________________________________
conv4_block14_1_bn (BatchNormal (None, 14, 14, 128) 512 conv4_block14_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block14_1_relu (Activatio (None, 14, 14, 128) 0 conv4_block14_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block14_2_conv (Conv2D) (None, 14, 14, 32) 36864 conv4_block14_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block14_concat (Concatena (None, 14, 14, 704) 0 conv4_block13_concat[0][0]
conv4_block14_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block15_0_bn (BatchNormal (None, 14, 14, 704) 2816 conv4_block14_concat[0][0]
__________________________________________________________________________________________________
conv4_block15_0_relu (Activatio (None, 14, 14, 704) 0 conv4_block15_0_bn[0][0]
__________________________________________________________________________________________________
conv4_block15_1_conv (Conv2D) (None, 14, 14, 128) 90112 conv4_block15_0_relu[0][0]
__________________________________________________________________________________________________
conv4_block15_1_bn (BatchNormal (None, 14, 14, 128) 512 conv4_block15_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block15_1_relu (Activatio (None, 14, 14, 128) 0 conv4_block15_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block15_2_conv (Conv2D) (None, 14, 14, 32) 36864 conv4_block15_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block15_concat (Concatena (None, 14, 14, 736) 0 conv4_block14_concat[0][0]
conv4_block15_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block16_0_bn (BatchNormal (None, 14, 14, 736) 2944 conv4_block15_concat[0][0]
__________________________________________________________________________________________________
conv4_block16_0_relu (Activatio (None, 14, 14, 736) 0 conv4_block16_0_bn[0][0]
__________________________________________________________________________________________________
conv4_block16_1_conv (Conv2D) (None, 14, 14, 128) 94208 conv4_block16_0_relu[0][0]
__________________________________________________________________________________________________
conv4_block16_1_bn (BatchNormal (None, 14, 14, 128) 512 conv4_block16_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block16_1_relu (Activatio (None, 14, 14, 128) 0 conv4_block16_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block16_2_conv (Conv2D) (None, 14, 14, 32) 36864 conv4_block16_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block16_concat (Concatena (None, 14, 14, 768) 0 conv4_block15_concat[0][0]
conv4_block16_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block17_0_bn (BatchNormal (None, 14, 14, 768) 3072 conv4_block16_concat[0][0]
__________________________________________________________________________________________________
conv4_block17_0_relu (Activatio (None, 14, 14, 768) 0 conv4_block17_0_bn[0][0]
__________________________________________________________________________________________________
conv4_block17_1_conv (Conv2D) (None, 14, 14, 128) 98304 conv4_block17_0_relu[0][0]
__________________________________________________________________________________________________
conv4_block17_1_bn (BatchNormal (None, 14, 14, 128) 512 conv4_block17_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block17_1_relu (Activatio (None, 14, 14, 128) 0 conv4_block17_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block17_2_conv (Conv2D) (None, 14, 14, 32) 36864 conv4_block17_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block17_concat (Concatena (None, 14, 14, 800) 0 conv4_block16_concat[0][0]
conv4_block17_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block18_0_bn (BatchNormal (None, 14, 14, 800) 3200 conv4_block17_concat[0][0]
__________________________________________________________________________________________________
conv4_block18_0_relu (Activatio (None, 14, 14, 800) 0 conv4_block18_0_bn[0][0]
__________________________________________________________________________________________________
conv4_block18_1_conv (Conv2D) (None, 14, 14, 128) 102400 conv4_block18_0_relu[0][0]
__________________________________________________________________________________________________
conv4_block18_1_bn (BatchNormal (None, 14, 14, 128) 512 conv4_block18_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block18_1_relu (Activatio (None, 14, 14, 128) 0 conv4_block18_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block18_2_conv (Conv2D) (None, 14, 14, 32) 36864 conv4_block18_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block18_concat (Concatena (None, 14, 14, 832) 0 conv4_block17_concat[0][0]
conv4_block18_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block19_0_bn (BatchNormal (None, 14, 14, 832) 3328 conv4_block18_concat[0][0]
__________________________________________________________________________________________________
conv4_block19_0_relu (Activatio (None, 14, 14, 832) 0 conv4_block19_0_bn[0][0]
__________________________________________________________________________________________________
conv4_block19_1_conv (Conv2D) (None, 14, 14, 128) 106496 conv4_block19_0_relu[0][0]
__________________________________________________________________________________________________
conv4_block19_1_bn (BatchNormal (None, 14, 14, 128) 512 conv4_block19_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block19_1_relu (Activatio (None, 14, 14, 128) 0 conv4_block19_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block19_2_conv (Conv2D) (None, 14, 14, 32) 36864 conv4_block19_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block19_concat (Concatena (None, 14, 14, 864) 0 conv4_block18_concat[0][0]
conv4_block19_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block20_0_bn (BatchNormal (None, 14, 14, 864) 3456 conv4_block19_concat[0][0]
__________________________________________________________________________________________________
conv4_block20_0_relu (Activatio (None, 14, 14, 864) 0 conv4_block20_0_bn[0][0]
__________________________________________________________________________________________________
conv4_block20_1_conv (Conv2D) (None, 14, 14, 128) 110592 conv4_block20_0_relu[0][0]
__________________________________________________________________________________________________
conv4_block20_1_bn (BatchNormal (None, 14, 14, 128) 512 conv4_block20_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block20_1_relu (Activatio (None, 14, 14, 128) 0 conv4_block20_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block20_2_conv (Conv2D) (None, 14, 14, 32) 36864 conv4_block20_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block20_concat (Concatena (None, 14, 14, 896) 0 conv4_block19_concat[0][0]
conv4_block20_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block21_0_bn (BatchNormal (None, 14, 14, 896) 3584 conv4_block20_concat[0][0]
__________________________________________________________________________________________________
conv4_block21_0_relu (Activatio (None, 14, 14, 896) 0 conv4_block21_0_bn[0][0]
__________________________________________________________________________________________________
conv4_block21_1_conv (Conv2D) (None, 14, 14, 128) 114688 conv4_block21_0_relu[0][0]
__________________________________________________________________________________________________
conv4_block21_1_bn (BatchNormal (None, 14, 14, 128) 512 conv4_block21_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block21_1_relu (Activatio (None, 14, 14, 128) 0 conv4_block21_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block21_2_conv (Conv2D) (None, 14, 14, 32) 36864 conv4_block21_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block21_concat (Concatena (None, 14, 14, 928) 0 conv4_block20_concat[0][0]
conv4_block21_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block22_0_bn (BatchNormal (None, 14, 14, 928) 3712 conv4_block21_concat[0][0]
__________________________________________________________________________________________________
conv4_block22_0_relu (Activatio (None, 14, 14, 928) 0 conv4_block22_0_bn[0][0]
__________________________________________________________________________________________________
conv4_block22_1_conv (Conv2D) (None, 14, 14, 128) 118784 conv4_block22_0_relu[0][0]
__________________________________________________________________________________________________
conv4_block22_1_bn (BatchNormal (None, 14, 14, 128) 512 conv4_block22_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block22_1_relu (Activatio (None, 14, 14, 128) 0 conv4_block22_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block22_2_conv (Conv2D) (None, 14, 14, 32) 36864 conv4_block22_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block22_concat (Concatena (None, 14, 14, 960) 0 conv4_block21_concat[0][0]
conv4_block22_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block23_0_bn (BatchNormal (None, 14, 14, 960) 3840 conv4_block22_concat[0][0]
__________________________________________________________________________________________________
conv4_block23_0_relu (Activatio (None, 14, 14, 960) 0 conv4_block23_0_bn[0][0]
__________________________________________________________________________________________________
conv4_block23_1_conv (Conv2D) (None, 14, 14, 128) 122880 conv4_block23_0_relu[0][0]
__________________________________________________________________________________________________
conv4_block23_1_bn (BatchNormal (None, 14, 14, 128) 512 conv4_block23_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block23_1_relu (Activatio (None, 14, 14, 128) 0 conv4_block23_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block23_2_conv (Conv2D) (None, 14, 14, 32) 36864 conv4_block23_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block23_concat (Concatena (None, 14, 14, 992) 0 conv4_block22_concat[0][0]
conv4_block23_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block24_0_bn (BatchNormal (None, 14, 14, 992) 3968 conv4_block23_concat[0][0]
__________________________________________________________________________________________________
conv4_block24_0_relu (Activatio (None, 14, 14, 992) 0 conv4_block24_0_bn[0][0]
__________________________________________________________________________________________________
conv4_block24_1_conv (Conv2D) (None, 14, 14, 128) 126976 conv4_block24_0_relu[0][0]
__________________________________________________________________________________________________
conv4_block24_1_bn (BatchNormal (None, 14, 14, 128) 512 conv4_block24_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block24_1_relu (Activatio (None, 14, 14, 128) 0 conv4_block24_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block24_2_conv (Conv2D) (None, 14, 14, 32) 36864 conv4_block24_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block24_concat (Concatena (None, 14, 14, 1024) 0 conv4_block23_concat[0][0]
conv4_block24_2_conv[0][0]
__________________________________________________________________________________________________
pool4_bn (BatchNormalization) (None, 14, 14, 1024) 4096 conv4_block24_concat[0][0]
__________________________________________________________________________________________________
pool4_relu (Activation) (None, 14, 14, 1024) 0 pool4_bn[0][0]
__________________________________________________________________________________________________
pool4_conv (Conv2D) (None, 14, 14, 512) 524288 pool4_relu[0][0]
__________________________________________________________________________________________________
pool4_pool (AveragePooling2D) (None, 7, 7, 512) 0 pool4_conv[0][0]
__________________________________________________________________________________________________
conv5_block1_0_bn (BatchNormali (None, 7, 7, 512) 2048 pool4_pool[0][0]
__________________________________________________________________________________________________
conv5_block1_0_relu (Activation (None, 7, 7, 512) 0 conv5_block1_0_bn[0][0]
__________________________________________________________________________________________________
conv5_block1_1_conv (Conv2D) (None, 7, 7, 128) 65536 conv5_block1_0_relu[0][0]
__________________________________________________________________________________________________
conv5_block1_1_bn (BatchNormali (None, 7, 7, 128) 512 conv5_block1_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block1_1_relu (Activation (None, 7, 7, 128) 0 conv5_block1_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block1_2_conv (Conv2D) (None, 7, 7, 32) 36864 conv5_block1_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block1_concat (Concatenat (None, 7, 7, 544) 0 pool4_pool[0][0]
conv5_block1_2_conv[0][0]
__________________________________________________________________________________________________
conv5_block2_0_bn (BatchNormali (None, 7, 7, 544) 2176 conv5_block1_concat[0][0]
__________________________________________________________________________________________________
conv5_block2_0_relu (Activation (None, 7, 7, 544) 0 conv5_block2_0_bn[0][0]
__________________________________________________________________________________________________
conv5_block2_1_conv (Conv2D) (None, 7, 7, 128) 69632 conv5_block2_0_relu[0][0]
__________________________________________________________________________________________________
conv5_block2_1_bn (BatchNormali (None, 7, 7, 128) 512 conv5_block2_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block2_1_relu (Activation (None, 7, 7, 128) 0 conv5_block2_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block2_2_conv (Conv2D) (None, 7, 7, 32) 36864 conv5_block2_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block2_concat (Concatenat (None, 7, 7, 576) 0 conv5_block1_concat[0][0]
conv5_block2_2_conv[0][0]
__________________________________________________________________________________________________
conv5_block3_0_bn (BatchNormali (None, 7, 7, 576) 2304 conv5_block2_concat[0][0]
__________________________________________________________________________________________________
conv5_block3_0_relu (Activation (None, 7, 7, 576) 0 conv5_block3_0_bn[0][0]
__________________________________________________________________________________________________
conv5_block3_1_conv (Conv2D) (None, 7, 7, 128) 73728 conv5_block3_0_relu[0][0]
__________________________________________________________________________________________________
conv5_block3_1_bn (BatchNormali (None, 7, 7, 128) 512 conv5_block3_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block3_1_relu (Activation (None, 7, 7, 128) 0 conv5_block3_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block3_2_conv (Conv2D) (None, 7, 7, 32) 36864 conv5_block3_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block3_concat (Concatenat (None, 7, 7, 608) 0 conv5_block2_concat[0][0]
conv5_block3_2_conv[0][0]
__________________________________________________________________________________________________
conv5_block4_0_bn (BatchNormali (None, 7, 7, 608) 2432 conv5_block3_concat[0][0]
__________________________________________________________________________________________________
conv5_block4_0_relu (Activation (None, 7, 7, 608) 0 conv5_block4_0_bn[0][0]
__________________________________________________________________________________________________
conv5_block4_1_conv (Conv2D) (None, 7, 7, 128) 77824 conv5_block4_0_relu[0][0]
__________________________________________________________________________________________________
conv5_block4_1_bn (BatchNormali (None, 7, 7, 128) 512 conv5_block4_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block4_1_relu (Activation (None, 7, 7, 128) 0 conv5_block4_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block4_2_conv (Conv2D) (None, 7, 7, 32) 36864 conv5_block4_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block4_concat (Concatenat (None, 7, 7, 640) 0 conv5_block3_concat[0][0]
conv5_block4_2_conv[0][0]
__________________________________________________________________________________________________
conv5_block5_0_bn (BatchNormali (None, 7, 7, 640) 2560 conv5_block4_concat[0][0]
__________________________________________________________________________________________________
conv5_block5_0_relu (Activation (None, 7, 7, 640) 0 conv5_block5_0_bn[0][0]
__________________________________________________________________________________________________
conv5_block5_1_conv (Conv2D) (None, 7, 7, 128) 81920 conv5_block5_0_relu[0][0]
__________________________________________________________________________________________________
conv5_block5_1_bn (BatchNormali (None, 7, 7, 128) 512 conv5_block5_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block5_1_relu (Activation (None, 7, 7, 128) 0 conv5_block5_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block5_2_conv (Conv2D) (None, 7, 7, 32) 36864 conv5_block5_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block5_concat (Concatenat (None, 7, 7, 672) 0 conv5_block4_concat[0][0]
conv5_block5_2_conv[0][0]
__________________________________________________________________________________________________
conv5_block6_0_bn (BatchNormali (None, 7, 7, 672) 2688 conv5_block5_concat[0][0]
__________________________________________________________________________________________________
conv5_block6_0_relu (Activation (None, 7, 7, 672) 0 conv5_block6_0_bn[0][0]
__________________________________________________________________________________________________
conv5_block6_1_conv (Conv2D) (None, 7, 7, 128) 86016 conv5_block6_0_relu[0][0]
__________________________________________________________________________________________________
conv5_block6_1_bn (BatchNormali (None, 7, 7, 128) 512 conv5_block6_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block6_1_relu (Activation (None, 7, 7, 128) 0 conv5_block6_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block6_2_conv (Conv2D) (None, 7, 7, 32) 36864 conv5_block6_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block6_concat (Concatenat (None, 7, 7, 704) 0 conv5_block5_concat[0][0]
conv5_block6_2_conv[0][0]
__________________________________________________________________________________________________
conv5_block7_0_bn (BatchNormali (None, 7, 7, 704) 2816 conv5_block6_concat[0][0]
__________________________________________________________________________________________________
conv5_block7_0_relu (Activation (None, 7, 7, 704) 0 conv5_block7_0_bn[0][0]
__________________________________________________________________________________________________
conv5_block7_1_conv (Conv2D) (None, 7, 7, 128) 90112 conv5_block7_0_relu[0][0]
__________________________________________________________________________________________________
conv5_block7_1_bn (BatchNormali (None, 7, 7, 128) 512 conv5_block7_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block7_1_relu (Activation (None, 7, 7, 128) 0 conv5_block7_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block7_2_conv (Conv2D) (None, 7, 7, 32) 36864 conv5_block7_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block7_concat (Concatenat (None, 7, 7, 736) 0 conv5_block6_concat[0][0]
conv5_block7_2_conv[0][0]
__________________________________________________________________________________________________
conv5_block8_0_bn (BatchNormali (None, 7, 7, 736) 2944 conv5_block7_concat[0][0]
__________________________________________________________________________________________________
conv5_block8_0_relu (Activation (None, 7, 7, 736) 0 conv5_block8_0_bn[0][0]
__________________________________________________________________________________________________
conv5_block8_1_conv (Conv2D) (None, 7, 7, 128) 94208 conv5_block8_0_relu[0][0]
__________________________________________________________________________________________________
conv5_block8_1_bn (BatchNormali (None, 7, 7, 128) 512 conv5_block8_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block8_1_relu (Activation (None, 7, 7, 128) 0 conv5_block8_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block8_2_conv (Conv2D) (None, 7, 7, 32) 36864 conv5_block8_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block8_concat (Concatenat (None, 7, 7, 768) 0 conv5_block7_concat[0][0]
conv5_block8_2_conv[0][0]
__________________________________________________________________________________________________
conv5_block9_0_bn (BatchNormali (None, 7, 7, 768) 3072 conv5_block8_concat[0][0]
__________________________________________________________________________________________________
conv5_block9_0_relu (Activation (None, 7, 7, 768) 0 conv5_block9_0_bn[0][0]
__________________________________________________________________________________________________
conv5_block9_1_conv (Conv2D) (None, 7, 7, 128) 98304 conv5_block9_0_relu[0][0]
__________________________________________________________________________________________________
conv5_block9_1_bn (BatchNormali (None, 7, 7, 128) 512 conv5_block9_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block9_1_relu (Activation (None, 7, 7, 128) 0 conv5_block9_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block9_2_conv (Conv2D) (None, 7, 7, 32) 36864 conv5_block9_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block9_concat (Concatenat (None, 7, 7, 800) 0 conv5_block8_concat[0][0]
conv5_block9_2_conv[0][0]
__________________________________________________________________________________________________
conv5_block10_0_bn (BatchNormal (None, 7, 7, 800) 3200 conv5_block9_concat[0][0]
__________________________________________________________________________________________________
conv5_block10_0_relu (Activatio (None, 7, 7, 800) 0 conv5_block10_0_bn[0][0]
__________________________________________________________________________________________________
conv5_block10_1_conv (Conv2D) (None, 7, 7, 128) 102400 conv5_block10_0_relu[0][0]
__________________________________________________________________________________________________
conv5_block10_1_bn (BatchNormal (None, 7, 7, 128) 512 conv5_block10_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block10_1_relu (Activatio (None, 7, 7, 128) 0 conv5_block10_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block10_2_conv (Conv2D) (None, 7, 7, 32) 36864 conv5_block10_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block10_concat (Concatena (None, 7, 7, 832) 0 conv5_block9_concat[0][0]
conv5_block10_2_conv[0][0]
__________________________________________________________________________________________________
conv5_block11_0_bn (BatchNormal (None, 7, 7, 832) 3328 conv5_block10_concat[0][0]
__________________________________________________________________________________________________
conv5_block11_0_relu (Activatio (None, 7, 7, 832) 0 conv5_block11_0_bn[0][0]
__________________________________________________________________________________________________
conv5_block11_1_conv (Conv2D) (None, 7, 7, 128) 106496 conv5_block11_0_relu[0][0]
__________________________________________________________________________________________________
conv5_block11_1_bn (BatchNormal (None, 7, 7, 128) 512 conv5_block11_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block11_1_relu (Activatio (None, 7, 7, 128) 0 conv5_block11_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block11_2_conv (Conv2D) (None, 7, 7, 32) 36864 conv5_block11_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block11_concat (Concatena (None, 7, 7, 864) 0 conv5_block10_concat[0][0]
conv5_block11_2_conv[0][0]
__________________________________________________________________________________________________
conv5_block12_0_bn (BatchNormal (None, 7, 7, 864) 3456 conv5_block11_concat[0][0]
__________________________________________________________________________________________________
conv5_block12_0_relu (Activatio (None, 7, 7, 864) 0 conv5_block12_0_bn[0][0]
__________________________________________________________________________________________________
conv5_block12_1_conv (Conv2D) (None, 7, 7, 128) 110592 conv5_block12_0_relu[0][0]
__________________________________________________________________________________________________
conv5_block12_1_bn (BatchNormal (None, 7, 7, 128) 512 conv5_block12_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block12_1_relu (Activatio (None, 7, 7, 128) 0 conv5_block12_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block12_2_conv (Conv2D) (None, 7, 7, 32) 36864 conv5_block12_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block12_concat (Concatena (None, 7, 7, 896) 0 conv5_block11_concat[0][0]
conv5_block12_2_conv[0][0]
__________________________________________________________________________________________________
conv5_block13_0_bn (BatchNormal (None, 7, 7, 896) 3584 conv5_block12_concat[0][0]
__________________________________________________________________________________________________
conv5_block13_0_relu (Activatio (None, 7, 7, 896) 0 conv5_block13_0_bn[0][0]
__________________________________________________________________________________________________
conv5_block13_1_conv (Conv2D) (None, 7, 7, 128) 114688 conv5_block13_0_relu[0][0]
__________________________________________________________________________________________________
conv5_block13_1_bn (BatchNormal (None, 7, 7, 128) 512 conv5_block13_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block13_1_relu (Activatio (None, 7, 7, 128) 0 conv5_block13_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block13_2_conv (Conv2D) (None, 7, 7, 32) 36864 conv5_block13_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block13_concat (Concatena (None, 7, 7, 928) 0 conv5_block12_concat[0][0]
conv5_block13_2_conv[0][0]
__________________________________________________________________________________________________
conv5_block14_0_bn (BatchNormal (None, 7, 7, 928) 3712 conv5_block13_concat[0][0]
__________________________________________________________________________________________________
conv5_block14_0_relu (Activatio (None, 7, 7, 928) 0 conv5_block14_0_bn[0][0]
__________________________________________________________________________________________________
conv5_block14_1_conv (Conv2D) (None, 7, 7, 128) 118784 conv5_block14_0_relu[0][0]
__________________________________________________________________________________________________
conv5_block14_1_bn (BatchNormal (None, 7, 7, 128) 512 conv5_block14_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block14_1_relu (Activatio (None, 7, 7, 128) 0 conv5_block14_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block14_2_conv (Conv2D) (None, 7, 7, 32) 36864 conv5_block14_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block14_concat (Concatena (None, 7, 7, 960) 0 conv5_block13_concat[0][0]
conv5_block14_2_conv[0][0]
__________________________________________________________________________________________________
conv5_block15_0_bn (BatchNormal (None, 7, 7, 960) 3840 conv5_block14_concat[0][0]
__________________________________________________________________________________________________
conv5_block15_0_relu (Activatio (None, 7, 7, 960) 0 conv5_block15_0_bn[0][0]
__________________________________________________________________________________________________
conv5_block15_1_conv (Conv2D) (None, 7, 7, 128) 122880 conv5_block15_0_relu[0][0]
__________________________________________________________________________________________________
conv5_block15_1_bn (BatchNormal (None, 7, 7, 128) 512 conv5_block15_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block15_1_relu (Activatio (None, 7, 7, 128) 0 conv5_block15_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block15_2_conv (Conv2D) (None, 7, 7, 32) 36864 conv5_block15_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block15_concat (Concatena (None, 7, 7, 992) 0 conv5_block14_concat[0][0]
conv5_block15_2_conv[0][0]
__________________________________________________________________________________________________
conv5_block16_0_bn (BatchNormal (None, 7, 7, 992) 3968 conv5_block15_concat[0][0]
__________________________________________________________________________________________________
conv5_block16_0_relu (Activatio (None, 7, 7, 992) 0 conv5_block16_0_bn[0][0]
__________________________________________________________________________________________________
conv5_block16_1_conv (Conv2D) (None, 7, 7, 128) 126976 conv5_block16_0_relu[0][0]
__________________________________________________________________________________________________
conv5_block16_1_bn (BatchNormal (None, 7, 7, 128) 512 conv5_block16_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block16_1_relu (Activatio (None, 7, 7, 128) 0 conv5_block16_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block16_2_conv (Conv2D) (None, 7, 7, 32) 36864 conv5_block16_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block16_concat (Concatena (None, 7, 7, 1024) 0 conv5_block15_concat[0][0]
conv5_block16_2_conv[0][0]
__________________________________________________________________________________________________
bn (BatchNormalization) (None, 7, 7, 1024) 4096 conv5_block16_concat[0][0]
__________________________________________________________________________________________________
relu (Activation) (None, 7, 7, 1024) 0 bn[0][0]
==================================================================================================
Total params: 7,037,504
Trainable params: 6,953,856
Non-trainable params: 83,648
__________________________________________________________________________________________________
import re
# Freeze the whole DenseNet121 base, then mark the final dense block
# (conv5) as trainable so only the deepest features are fine-tuned.
dense_base.trainable = False
for layer in dense_base.layers:
    if re.search('conv5', layer.name):
        layer.trainable = True
        print(layer.name, ": Trainable")
conv5_block1_0_bn : Trainable conv5_block1_0_relu : Trainable conv5_block1_1_conv : Trainable conv5_block1_1_bn : Trainable conv5_block1_1_relu : Trainable conv5_block1_2_conv : Trainable conv5_block1_concat : Trainable ... conv5_block16_0_bn : Trainable conv5_block16_0_relu : Trainable conv5_block16_1_conv : Trainable conv5_block16_1_bn : Trainable conv5_block16_1_relu : Trainable conv5_block16_2_conv : Trainable conv5_block16_concat : Trainable
(output truncated: every layer in conv5_block1 through conv5_block16 is reported Trainable)
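A caveat on the cell above: because dense_base.trainable is set to False, Keras masks the per-layer flags once the base is nested inside another model, so the conv5 blocks remain frozen in practice; the summary of model_D1 below accordingly reports all 7,037,504 base parameters as non-trainable. A minimal sketch of a selective unfreeze that Keras will honor (same dense_base assumed):
import re
# Keep the wrapper trainable so per-layer flags are respected,
# then freeze everything except the conv5 blocks.
dense_base.trainable = True
for layer in dense_base.layers:
    layer.trainable = bool(re.search('conv5', layer.name))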
model_D1 = Sequential()
model_D1.add(dense_base)   # pretrained DenseNet121 feature extractor
model_D1.add(Flatten())    # 7 x 7 x 1024 -> 50,176 features
model_D1.add(Dense(units=25, activation='relu'))
model_D1.add(Dense(units=3, activation='softmax'))
model_D1.summary()
Model: "sequential_3" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= densenet121 (Functional) (None, 7, 7, 1024) 7037504 _________________________________________________________________ flatten_3 (Flatten) (None, 50176) 0 _________________________________________________________________ dense_6 (Dense) (None, 25) 1254425 _________________________________________________________________ dense_7 (Dense) (None, 3) 78 ================================================================= Total params: 8,292,007 Trainable params: 1,254,503 Non-trainable params: 7,037,504 _________________________________________________________________
model_D1.compile(optimizer='rmsprop',
                 loss='categorical_crossentropy',
                 metrics=['accuracy', Precision(), Recall()])
# The generators set the batch size, so no batch_size argument is passed to fit.
model_D1_history = model_D1.fit(training_data,
                                validation_data=validation_data,
                                epochs=50,
                                callbacks=[callback_earlystp])
Epoch 1/50
81/81 [==============================] - 35s 349ms/step - loss: 4.8115 - accuracy: 0.6827 - precision_3: 0.6857 - recall_3: 0.6796 - val_loss: 0.4378 - val_accuracy: 0.9547 - val_precision_3: 0.9547 - val_recall_3: 0.9547
Epoch 2/50
81/81 [==============================] - 26s 320ms/step - loss: 0.2488 - accuracy: 0.9720 - precision_3: 0.9719 - recall_3: 0.9707 - val_loss: 0.1561 - val_accuracy: 0.9753 - val_precision_3: 0.9753 - val_recall_3: 0.9753
Epoch 3/50
81/81 [==============================] - 26s 319ms/step - loss: 0.1399 - accuracy: 0.9809 - precision_3: 0.9808 - recall_3: 0.9785 - val_loss: 0.0362 - val_accuracy: 0.9877 - val_precision_3: 0.9877 - val_recall_3: 0.9877
Epoch 4/50
81/81 [==============================] - 26s 316ms/step - loss: 0.2027 - accuracy: 0.9676 - precision_3: 0.9676 - recall_3: 0.9676 - val_loss: 0.2772 - val_accuracy: 0.9671 - val_precision_3: 0.9671 - val_recall_3: 0.9671
Epoch 5/50
81/81 [==============================] - 26s 322ms/step - loss: 0.1288 - accuracy: 0.9777 - precision_3: 0.9777 - recall_3: 0.9777 - val_loss: 0.0830 - val_accuracy: 0.9877 - val_precision_3: 0.9877 - val_recall_3: 0.9877
Epoch 6/50
81/81 [==============================] - 26s 320ms/step - loss: 0.1082 - accuracy: 0.9727 - precision_3: 0.9727 - recall_3: 0.9727 - val_loss: 0.1272 - val_accuracy: 0.9877 - val_precision_3: 0.9877 - val_recall_3: 0.9877
Epoch 7/50
81/81 [==============================] - 26s 319ms/step - loss: 0.0024 - accuracy: 0.9989 - precision_3: 0.9989 - recall_3: 0.9989 - val_loss: 0.1447 - val_accuracy: 0.9877 - val_precision_3: 0.9877 - val_recall_3: 0.9877
Epoch 8/50
81/81 [==============================] - 26s 316ms/step - loss: 3.9146e-04 - accuracy: 1.0000 - precision_3: 1.0000 - recall_3: 1.0000 - val_loss: 0.2958 - val_accuracy: 0.9835 - val_precision_3: 0.9835 - val_recall_3: 0.9835
Epoch 9/50
81/81 [==============================] - 26s 319ms/step - loss: 0.0396 - accuracy: 0.9986 - precision_3: 0.9986 - recall_3: 0.9986 - val_loss: 0.3812 - val_accuracy: 0.9835 - val_precision_3: 0.9835 - val_recall_3: 0.9835
Epoch 10/50
81/81 [==============================] - 26s 317ms/step - loss: 0.0020 - accuracy: 0.9993 - precision_3: 0.9993 - recall_3: 0.9993 - val_loss: 0.2231 - val_accuracy: 0.9877 - val_precision_3: 0.9877 - val_recall_3: 0.9877
Epoch 11/50
81/81 [==============================] - 26s 320ms/step - loss: 7.4729e-05 - accuracy: 1.0000 - precision_3: 1.0000 - recall_3: 1.0000 - val_loss: 1.8463 - val_accuracy: 0.9012 - val_precision_3: 0.9012 - val_recall_3: 0.9012
Epoch 12/50
81/81 [==============================] - 26s 318ms/step - loss: 0.0706 - accuracy: 0.9907 - precision_3: 0.9907 - recall_3: 0.9907 - val_loss: 0.4714 - val_accuracy: 0.9835 - val_precision_3: 0.9835 - val_recall_3: 0.9835
Epoch 13/50
81/81 [==============================] - 26s 324ms/step - loss: 0.0435 - accuracy: 0.9950 - precision_3: 0.9950 - recall_3: 0.9950 - val_loss: 0.0797 - val_accuracy: 0.9959 - val_precision_3: 0.9959 - val_recall_3: 0.9959
Epoch 14/50
81/81 [==============================] - 27s 336ms/step - loss: 0.0086 - accuracy: 0.9966 - precision_3: 0.9966 - recall_3: 0.9966 - val_loss: 0.1983 - val_accuracy: 0.9918 - val_precision_3: 0.9918 - val_recall_3: 0.9918
Epoch 15/50
81/81 [==============================] - 27s 335ms/step - loss: 0.0041 - accuracy: 0.9985 - precision_3: 0.9985 - recall_3: 0.9985 - val_loss: 0.3677 - val_accuracy: 0.9794 - val_precision_3: 0.9794 - val_recall_3: 0.9794
Epoch 16/50
81/81 [==============================] - 26s 325ms/step - loss: 2.5950e-06 - accuracy: 1.0000 - precision_3: 1.0000 - recall_3: 1.0000 - val_loss: 0.8464 - val_accuracy: 0.9465 - val_precision_3: 0.9465 - val_recall_3: 0.9465
Epoch 17/50
81/81 [==============================] - 26s 323ms/step - loss: 0.0077 - accuracy: 0.9985 - precision_3: 0.9985 - recall_3: 0.9985 - val_loss: 0.1470 - val_accuracy: 0.9918 - val_precision_3: 0.9918 - val_recall_3: 0.9918
Epoch 18/50
81/81 [==============================] - 26s 325ms/step - loss: 2.5739e-06 - accuracy: 1.0000 - precision_3: 1.0000 - recall_3: 1.0000 - val_loss: 0.1490 - val_accuracy: 0.9918 - val_precision_3: 0.9918 - val_recall_3: 0.9918
Epoch 00018: early stopping
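Training stops at epoch 18 through the callback_earlystp callback defined earlier in the notebook. For reference, a typical EarlyStopping configuration looks like the sketch below; the monitor, patience, and restore_best_weights values are illustrative assumptions, not the notebook's actual settings.
from tensorflow.keras.callbacks import EarlyStopping

# Illustrative early-stopping setup (assumed values, not this notebook's actual config)
callback_earlystp = EarlyStopping(
    monitor='val_accuracy',      # assumed: watch validation accuracy
    patience=5,                  # assumed: stop after 5 epochs without improvement
    restore_best_weights=True,   # assumed: roll back to the best epoch's weights
    verbose=1)
With restore_best_weights=True, the model would be evaluated with the best epoch's weights rather than the final epoch's, which matters when validation loss swings as much as it does here.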
# Plot training vs. validation accuracy, precision, and recall across epochs
plt.subplots_adjust(right=1.95, left=.005)
plt.subplot(1,3,1)
plt.plot(model_D1_history.history['accuracy'])
plt.plot(model_D1_history.history['val_accuracy'])
plt.ylabel('Accuracy')
plt.xlabel('')
plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,2)
plt.plot(model_D1_history.history['precision_3'])
plt.plot(model_D1_history.history['val_precision_3'])
plt.ylabel('Precision')
plt.xlabel('Epoch')
plt.subplot(1,3,3)
plt.plot(model_D1_history.history['recall_3'])
plt.plot(model_D1_history.history['val_recall_3'])
plt.ylabel('Recall')
plt.xlabel('')
plt.show()
test_loss, test_acc, test_precision, test_recall = model_D1.evaluate(testing_data)
print('%s %.2f' % ('test_acc:', test_acc * 100.0))
print('%s %.2f' % ('test_loss:', test_loss))
print('%s %.2f' % ('test_precision:', test_precision))
print('%s %.2f' % ('test_recall:', test_recall))
27/27 [==============================] - 6s 227ms/step - loss: 0.0617 - accuracy: 0.9918 - precision_3: 0.9918 - recall_3: 0.9918
test_acc: 99.18
test_loss: 0.06
test_precision: 0.99
test_recall: 0.99
Between the two pre-trained architectures, DenseNet121 performed far better than VGG16, which was itself only marginally better than our hand-built network. We will compare one more pre-trained model: ResNet50.
ResNet50
from tensorflow.keras.applications import ResNet50   # assumed not already imported with the earlier pre-trained models

resnet_base = ResNet50(weights='imagenet', include_top=False, input_shape=(224,224,3))
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/resnet/resnet50_weights_tf_dim_ordering_tf_kernels_notop.h5 94773248/94765736 [==============================] - 0s 0us/step
resnet_base.summary()
Model: "resnet50"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_5 (InputLayer) [(None, 224, 224, 3) 0
__________________________________________________________________________________________________
conv1_pad (ZeroPadding2D) (None, 230, 230, 3) 0 input_5[0][0]
__________________________________________________________________________________________________
conv1_conv (Conv2D) (None, 112, 112, 64) 9472 conv1_pad[0][0]
__________________________________________________________________________________________________
conv1_bn (BatchNormalization) (None, 112, 112, 64) 256 conv1_conv[0][0]
__________________________________________________________________________________________________
conv1_relu (Activation) (None, 112, 112, 64) 0 conv1_bn[0][0]
__________________________________________________________________________________________________
pool1_pad (ZeroPadding2D) (None, 114, 114, 64) 0 conv1_relu[0][0]
__________________________________________________________________________________________________
pool1_pool (MaxPooling2D) (None, 56, 56, 64) 0 pool1_pad[0][0]
__________________________________________________________________________________________________
conv2_block1_1_conv (Conv2D) (None, 56, 56, 64) 4160 pool1_pool[0][0]
__________________________________________________________________________________________________
conv2_block1_1_bn (BatchNormali (None, 56, 56, 64) 256 conv2_block1_1_conv[0][0]
__________________________________________________________________________________________________
conv2_block1_1_relu (Activation (None, 56, 56, 64) 0 conv2_block1_1_bn[0][0]
__________________________________________________________________________________________________
conv2_block1_2_conv (Conv2D) (None, 56, 56, 64) 36928 conv2_block1_1_relu[0][0]
__________________________________________________________________________________________________
conv2_block1_2_bn (BatchNormali (None, 56, 56, 64) 256 conv2_block1_2_conv[0][0]
__________________________________________________________________________________________________
conv2_block1_2_relu (Activation (None, 56, 56, 64) 0 conv2_block1_2_bn[0][0]
__________________________________________________________________________________________________
conv2_block1_0_conv (Conv2D) (None, 56, 56, 256) 16640 pool1_pool[0][0]
__________________________________________________________________________________________________
conv2_block1_3_conv (Conv2D) (None, 56, 56, 256) 16640 conv2_block1_2_relu[0][0]
__________________________________________________________________________________________________
conv2_block1_0_bn (BatchNormali (None, 56, 56, 256) 1024 conv2_block1_0_conv[0][0]
__________________________________________________________________________________________________
conv2_block1_3_bn (BatchNormali (None, 56, 56, 256) 1024 conv2_block1_3_conv[0][0]
__________________________________________________________________________________________________
conv2_block1_add (Add) (None, 56, 56, 256) 0 conv2_block1_0_bn[0][0]
conv2_block1_3_bn[0][0]
__________________________________________________________________________________________________
conv2_block1_out (Activation) (None, 56, 56, 256) 0 conv2_block1_add[0][0]
__________________________________________________________________________________________________
conv2_block2_1_conv (Conv2D) (None, 56, 56, 64) 16448 conv2_block1_out[0][0]
__________________________________________________________________________________________________
conv2_block2_1_bn (BatchNormali (None, 56, 56, 64) 256 conv2_block2_1_conv[0][0]
__________________________________________________________________________________________________
conv2_block2_1_relu (Activation (None, 56, 56, 64) 0 conv2_block2_1_bn[0][0]
__________________________________________________________________________________________________
conv2_block2_2_conv (Conv2D) (None, 56, 56, 64) 36928 conv2_block2_1_relu[0][0]
__________________________________________________________________________________________________
conv2_block2_2_bn (BatchNormali (None, 56, 56, 64) 256 conv2_block2_2_conv[0][0]
__________________________________________________________________________________________________
conv2_block2_2_relu (Activation (None, 56, 56, 64) 0 conv2_block2_2_bn[0][0]
__________________________________________________________________________________________________
conv2_block2_3_conv (Conv2D) (None, 56, 56, 256) 16640 conv2_block2_2_relu[0][0]
__________________________________________________________________________________________________
conv2_block2_3_bn (BatchNormali (None, 56, 56, 256) 1024 conv2_block2_3_conv[0][0]
__________________________________________________________________________________________________
conv2_block2_add (Add) (None, 56, 56, 256) 0 conv2_block1_out[0][0]
conv2_block2_3_bn[0][0]
__________________________________________________________________________________________________
conv2_block2_out (Activation) (None, 56, 56, 256) 0 conv2_block2_add[0][0]
__________________________________________________________________________________________________
conv2_block3_1_conv (Conv2D) (None, 56, 56, 64) 16448 conv2_block2_out[0][0]
__________________________________________________________________________________________________
conv2_block3_1_bn (BatchNormali (None, 56, 56, 64) 256 conv2_block3_1_conv[0][0]
__________________________________________________________________________________________________
conv2_block3_1_relu (Activation (None, 56, 56, 64) 0 conv2_block3_1_bn[0][0]
__________________________________________________________________________________________________
conv2_block3_2_conv (Conv2D) (None, 56, 56, 64) 36928 conv2_block3_1_relu[0][0]
__________________________________________________________________________________________________
conv2_block3_2_bn (BatchNormali (None, 56, 56, 64) 256 conv2_block3_2_conv[0][0]
__________________________________________________________________________________________________
conv2_block3_2_relu (Activation (None, 56, 56, 64) 0 conv2_block3_2_bn[0][0]
__________________________________________________________________________________________________
conv2_block3_3_conv (Conv2D) (None, 56, 56, 256) 16640 conv2_block3_2_relu[0][0]
__________________________________________________________________________________________________
conv2_block3_3_bn (BatchNormali (None, 56, 56, 256) 1024 conv2_block3_3_conv[0][0]
__________________________________________________________________________________________________
conv2_block3_add (Add) (None, 56, 56, 256) 0 conv2_block2_out[0][0]
conv2_block3_3_bn[0][0]
__________________________________________________________________________________________________
conv2_block3_out (Activation) (None, 56, 56, 256) 0 conv2_block3_add[0][0]
__________________________________________________________________________________________________
conv3_block1_1_conv (Conv2D) (None, 28, 28, 128) 32896 conv2_block3_out[0][0]
__________________________________________________________________________________________________
conv3_block1_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block1_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block1_1_relu (Activation (None, 28, 28, 128) 0 conv3_block1_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block1_2_conv (Conv2D) (None, 28, 28, 128) 147584 conv3_block1_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block1_2_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block1_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block1_2_relu (Activation (None, 28, 28, 128) 0 conv3_block1_2_bn[0][0]
__________________________________________________________________________________________________
conv3_block1_0_conv (Conv2D) (None, 28, 28, 512) 131584 conv2_block3_out[0][0]
__________________________________________________________________________________________________
conv3_block1_3_conv (Conv2D) (None, 28, 28, 512) 66048 conv3_block1_2_relu[0][0]
__________________________________________________________________________________________________
conv3_block1_0_bn (BatchNormali (None, 28, 28, 512) 2048 conv3_block1_0_conv[0][0]
__________________________________________________________________________________________________
conv3_block1_3_bn (BatchNormali (None, 28, 28, 512) 2048 conv3_block1_3_conv[0][0]
__________________________________________________________________________________________________
conv3_block1_add (Add) (None, 28, 28, 512) 0 conv3_block1_0_bn[0][0]
conv3_block1_3_bn[0][0]
__________________________________________________________________________________________________
conv3_block1_out (Activation) (None, 28, 28, 512) 0 conv3_block1_add[0][0]
__________________________________________________________________________________________________
conv3_block2_1_conv (Conv2D) (None, 28, 28, 128) 65664 conv3_block1_out[0][0]
__________________________________________________________________________________________________
conv3_block2_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block2_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block2_1_relu (Activation (None, 28, 28, 128) 0 conv3_block2_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block2_2_conv (Conv2D) (None, 28, 28, 128) 147584 conv3_block2_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block2_2_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block2_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block2_2_relu (Activation (None, 28, 28, 128) 0 conv3_block2_2_bn[0][0]
__________________________________________________________________________________________________
conv3_block2_3_conv (Conv2D) (None, 28, 28, 512) 66048 conv3_block2_2_relu[0][0]
__________________________________________________________________________________________________
conv3_block2_3_bn (BatchNormali (None, 28, 28, 512) 2048 conv3_block2_3_conv[0][0]
__________________________________________________________________________________________________
conv3_block2_add (Add) (None, 28, 28, 512) 0 conv3_block1_out[0][0]
conv3_block2_3_bn[0][0]
__________________________________________________________________________________________________
conv3_block2_out (Activation) (None, 28, 28, 512) 0 conv3_block2_add[0][0]
__________________________________________________________________________________________________
conv3_block3_1_conv (Conv2D) (None, 28, 28, 128) 65664 conv3_block2_out[0][0]
__________________________________________________________________________________________________
conv3_block3_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block3_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block3_1_relu (Activation (None, 28, 28, 128) 0 conv3_block3_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block3_2_conv (Conv2D) (None, 28, 28, 128) 147584 conv3_block3_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block3_2_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block3_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block3_2_relu (Activation (None, 28, 28, 128) 0 conv3_block3_2_bn[0][0]
__________________________________________________________________________________________________
conv3_block3_3_conv (Conv2D) (None, 28, 28, 512) 66048 conv3_block3_2_relu[0][0]
__________________________________________________________________________________________________
conv3_block3_3_bn (BatchNormali (None, 28, 28, 512) 2048 conv3_block3_3_conv[0][0]
__________________________________________________________________________________________________
conv3_block3_add (Add) (None, 28, 28, 512) 0 conv3_block2_out[0][0]
conv3_block3_3_bn[0][0]
__________________________________________________________________________________________________
conv3_block3_out (Activation) (None, 28, 28, 512) 0 conv3_block3_add[0][0]
__________________________________________________________________________________________________
conv3_block4_1_conv (Conv2D) (None, 28, 28, 128) 65664 conv3_block3_out[0][0]
__________________________________________________________________________________________________
conv3_block4_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block4_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block4_1_relu (Activation (None, 28, 28, 128) 0 conv3_block4_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block4_2_conv (Conv2D) (None, 28, 28, 128) 147584 conv3_block4_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block4_2_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block4_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block4_2_relu (Activation (None, 28, 28, 128) 0 conv3_block4_2_bn[0][0]
__________________________________________________________________________________________________
conv3_block4_3_conv (Conv2D) (None, 28, 28, 512) 66048 conv3_block4_2_relu[0][0]
__________________________________________________________________________________________________
conv3_block4_3_bn (BatchNormali (None, 28, 28, 512) 2048 conv3_block4_3_conv[0][0]
__________________________________________________________________________________________________
conv3_block4_add (Add) (None, 28, 28, 512) 0 conv3_block3_out[0][0]
conv3_block4_3_bn[0][0]
__________________________________________________________________________________________________
conv3_block4_out (Activation) (None, 28, 28, 512) 0 conv3_block4_add[0][0]
__________________________________________________________________________________________________
conv4_block1_1_conv (Conv2D) (None, 14, 14, 256) 131328 conv3_block4_out[0][0]
__________________________________________________________________________________________________
conv4_block1_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block1_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block1_1_relu (Activation (None, 14, 14, 256) 0 conv4_block1_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block1_2_conv (Conv2D) (None, 14, 14, 256) 590080 conv4_block1_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block1_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block1_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block1_2_relu (Activation (None, 14, 14, 256) 0 conv4_block1_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block1_0_conv (Conv2D) (None, 14, 14, 1024) 525312 conv3_block4_out[0][0]
__________________________________________________________________________________________________
conv4_block1_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block1_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block1_0_bn (BatchNormali (None, 14, 14, 1024) 4096 conv4_block1_0_conv[0][0]
__________________________________________________________________________________________________
conv4_block1_3_bn (BatchNormali (None, 14, 14, 1024) 4096 conv4_block1_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block1_add (Add) (None, 14, 14, 1024) 0 conv4_block1_0_bn[0][0]
conv4_block1_3_bn[0][0]
__________________________________________________________________________________________________
conv4_block1_out (Activation) (None, 14, 14, 1024) 0 conv4_block1_add[0][0]
__________________________________________________________________________________________________
conv4_block2_1_conv (Conv2D) (None, 14, 14, 256) 262400 conv4_block1_out[0][0]
__________________________________________________________________________________________________
conv4_block2_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block2_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block2_1_relu (Activation (None, 14, 14, 256) 0 conv4_block2_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block2_2_conv (Conv2D) (None, 14, 14, 256) 590080 conv4_block2_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block2_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block2_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block2_2_relu (Activation (None, 14, 14, 256) 0 conv4_block2_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block2_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block2_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block2_3_bn (BatchNormali (None, 14, 14, 1024) 4096 conv4_block2_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block2_add (Add) (None, 14, 14, 1024) 0 conv4_block1_out[0][0]
conv4_block2_3_bn[0][0]
__________________________________________________________________________________________________
conv4_block2_out (Activation) (None, 14, 14, 1024) 0 conv4_block2_add[0][0]
__________________________________________________________________________________________________
conv4_block3_1_conv (Conv2D) (None, 14, 14, 256) 262400 conv4_block2_out[0][0]
__________________________________________________________________________________________________
conv4_block3_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block3_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block3_1_relu (Activation (None, 14, 14, 256) 0 conv4_block3_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block3_2_conv (Conv2D) (None, 14, 14, 256) 590080 conv4_block3_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block3_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block3_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block3_2_relu (Activation (None, 14, 14, 256) 0 conv4_block3_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block3_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block3_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block3_3_bn (BatchNormali (None, 14, 14, 1024) 4096 conv4_block3_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block3_add (Add) (None, 14, 14, 1024) 0 conv4_block2_out[0][0]
conv4_block3_3_bn[0][0]
__________________________________________________________________________________________________
conv4_block3_out (Activation) (None, 14, 14, 1024) 0 conv4_block3_add[0][0]
__________________________________________________________________________________________________
conv4_block4_1_conv (Conv2D) (None, 14, 14, 256) 262400 conv4_block3_out[0][0]
__________________________________________________________________________________________________
conv4_block4_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block4_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block4_1_relu (Activation (None, 14, 14, 256) 0 conv4_block4_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block4_2_conv (Conv2D) (None, 14, 14, 256) 590080 conv4_block4_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block4_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block4_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block4_2_relu (Activation (None, 14, 14, 256) 0 conv4_block4_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block4_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block4_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block4_3_bn (BatchNormali (None, 14, 14, 1024) 4096 conv4_block4_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block4_add (Add) (None, 14, 14, 1024) 0 conv4_block3_out[0][0]
conv4_block4_3_bn[0][0]
__________________________________________________________________________________________________
conv4_block4_out (Activation) (None, 14, 14, 1024) 0 conv4_block4_add[0][0]
__________________________________________________________________________________________________
conv4_block5_1_conv (Conv2D) (None, 14, 14, 256) 262400 conv4_block4_out[0][0]
__________________________________________________________________________________________________
conv4_block5_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block5_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block5_1_relu (Activation (None, 14, 14, 256) 0 conv4_block5_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block5_2_conv (Conv2D) (None, 14, 14, 256) 590080 conv4_block5_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block5_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block5_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block5_2_relu (Activation (None, 14, 14, 256) 0 conv4_block5_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block5_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block5_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block5_3_bn (BatchNormali (None, 14, 14, 1024) 4096 conv4_block5_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block5_add (Add) (None, 14, 14, 1024) 0 conv4_block4_out[0][0]
conv4_block5_3_bn[0][0]
__________________________________________________________________________________________________
conv4_block5_out (Activation) (None, 14, 14, 1024) 0 conv4_block5_add[0][0]
__________________________________________________________________________________________________
conv4_block6_1_conv (Conv2D) (None, 14, 14, 256) 262400 conv4_block5_out[0][0]
__________________________________________________________________________________________________
conv4_block6_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block6_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block6_1_relu (Activation (None, 14, 14, 256) 0 conv4_block6_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block6_2_conv (Conv2D) (None, 14, 14, 256) 590080 conv4_block6_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block6_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block6_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block6_2_relu (Activation (None, 14, 14, 256) 0 conv4_block6_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block6_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block6_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block6_3_bn (BatchNormali (None, 14, 14, 1024) 4096 conv4_block6_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block6_add (Add) (None, 14, 14, 1024) 0 conv4_block5_out[0][0]
conv4_block6_3_bn[0][0]
__________________________________________________________________________________________________
conv4_block6_out (Activation) (None, 14, 14, 1024) 0 conv4_block6_add[0][0]
__________________________________________________________________________________________________
conv5_block1_1_conv (Conv2D) (None, 7, 7, 512) 524800 conv4_block6_out[0][0]
__________________________________________________________________________________________________
conv5_block1_1_bn (BatchNormali (None, 7, 7, 512) 2048 conv5_block1_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block1_1_relu (Activation (None, 7, 7, 512) 0 conv5_block1_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block1_2_conv (Conv2D) (None, 7, 7, 512) 2359808 conv5_block1_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block1_2_bn (BatchNormali (None, 7, 7, 512) 2048 conv5_block1_2_conv[0][0]
__________________________________________________________________________________________________
conv5_block1_2_relu (Activation (None, 7, 7, 512) 0 conv5_block1_2_bn[0][0]
__________________________________________________________________________________________________
conv5_block1_0_conv (Conv2D) (None, 7, 7, 2048) 2099200 conv4_block6_out[0][0]
__________________________________________________________________________________________________
conv5_block1_3_conv (Conv2D) (None, 7, 7, 2048) 1050624 conv5_block1_2_relu[0][0]
__________________________________________________________________________________________________
conv5_block1_0_bn (BatchNormali (None, 7, 7, 2048) 8192 conv5_block1_0_conv[0][0]
__________________________________________________________________________________________________
conv5_block1_3_bn (BatchNormali (None, 7, 7, 2048) 8192 conv5_block1_3_conv[0][0]
__________________________________________________________________________________________________
conv5_block1_add (Add) (None, 7, 7, 2048) 0 conv5_block1_0_bn[0][0]
conv5_block1_3_bn[0][0]
__________________________________________________________________________________________________
conv5_block1_out (Activation) (None, 7, 7, 2048) 0 conv5_block1_add[0][0]
__________________________________________________________________________________________________
conv5_block2_1_conv (Conv2D) (None, 7, 7, 512) 1049088 conv5_block1_out[0][0]
__________________________________________________________________________________________________
conv5_block2_1_bn (BatchNormali (None, 7, 7, 512) 2048 conv5_block2_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block2_1_relu (Activation (None, 7, 7, 512) 0 conv5_block2_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block2_2_conv (Conv2D) (None, 7, 7, 512) 2359808 conv5_block2_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block2_2_bn (BatchNormali (None, 7, 7, 512) 2048 conv5_block2_2_conv[0][0]
__________________________________________________________________________________________________
conv5_block2_2_relu (Activation (None, 7, 7, 512) 0 conv5_block2_2_bn[0][0]
__________________________________________________________________________________________________
conv5_block2_3_conv (Conv2D) (None, 7, 7, 2048) 1050624 conv5_block2_2_relu[0][0]
__________________________________________________________________________________________________
conv5_block2_3_bn (BatchNormali (None, 7, 7, 2048) 8192 conv5_block2_3_conv[0][0]
__________________________________________________________________________________________________
conv5_block2_add (Add) (None, 7, 7, 2048) 0 conv5_block1_out[0][0]
conv5_block2_3_bn[0][0]
__________________________________________________________________________________________________
conv5_block2_out (Activation) (None, 7, 7, 2048) 0 conv5_block2_add[0][0]
__________________________________________________________________________________________________
conv5_block3_1_conv (Conv2D) (None, 7, 7, 512) 1049088 conv5_block2_out[0][0]
__________________________________________________________________________________________________
conv5_block3_1_bn (BatchNormali (None, 7, 7, 512) 2048 conv5_block3_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block3_1_relu (Activation (None, 7, 7, 512) 0 conv5_block3_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block3_2_conv (Conv2D) (None, 7, 7, 512) 2359808 conv5_block3_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block3_2_bn (BatchNormali (None, 7, 7, 512) 2048 conv5_block3_2_conv[0][0]
__________________________________________________________________________________________________
conv5_block3_2_relu (Activation (None, 7, 7, 512) 0 conv5_block3_2_bn[0][0]
__________________________________________________________________________________________________
conv5_block3_3_conv (Conv2D) (None, 7, 7, 2048) 1050624 conv5_block3_2_relu[0][0]
__________________________________________________________________________________________________
conv5_block3_3_bn (BatchNormali (None, 7, 7, 2048) 8192 conv5_block3_3_conv[0][0]
__________________________________________________________________________________________________
conv5_block3_add (Add) (None, 7, 7, 2048) 0 conv5_block2_out[0][0]
conv5_block3_3_bn[0][0]
__________________________________________________________________________________________________
conv5_block3_out (Activation) (None, 7, 7, 2048) 0 conv5_block3_add[0][0]
==================================================================================================
Total params: 23,587,712
Trainable params: 23,534,592
Non-trainable params: 53,120
__________________________________________________________________________________________________
import re

# Freeze the whole base, then flag only the conv5 layers as trainable.
# Caveat: because resnet_base itself keeps trainable = False, Keras ignores the
# per-layer flags once the model is nested inside model_R1, so in practice all
# of the base's weights stay frozen (see the parameter counts in the summary below).
resnet_base.trainable = False
for layer in resnet_base.layers:
    if re.search('conv5', layer.name):
        layer.trainable = True
        print(layer.name, ": Trainable")
conv5_block1_1_conv : Trainable conv5_block1_1_bn : Trainable conv5_block1_1_relu : Trainable conv5_block1_2_conv : Trainable conv5_block1_2_bn : Trainable conv5_block1_2_relu : Trainable conv5_block1_0_conv : Trainable conv5_block1_3_conv : Trainable conv5_block1_0_bn : Trainable conv5_block1_3_bn : Trainable conv5_block1_add : Trainable conv5_block1_out : Trainable conv5_block2_1_conv : Trainable conv5_block2_1_bn : Trainable conv5_block2_1_relu : Trainable conv5_block2_2_conv : Trainable conv5_block2_2_bn : Trainable conv5_block2_2_relu : Trainable conv5_block2_3_conv : Trainable conv5_block2_3_bn : Trainable conv5_block2_add : Trainable conv5_block2_out : Trainable conv5_block3_1_conv : Trainable conv5_block3_1_bn : Trainable conv5_block3_1_relu : Trainable conv5_block3_2_conv : Trainable conv5_block3_2_bn : Trainable conv5_block3_2_relu : Trainable conv5_block3_3_conv : Trainable conv5_block3_3_bn : Trainable conv5_block3_add : Trainable conv5_block3_out : Trainable
model_R1 = Sequential()
model_R1.add( resnet_base )      # ResNet50 convolutional base (all weights frozen, per the caveat above)
model_R1.add( Flatten() )        # 7 x 7 x 2048 feature map -> 100,352 values
model_R1.add( Dense(units=25, activation = 'relu') )     # input size is inferred from Flatten, so no input_dim is needed
model_R1.add( Dense(units=3, activation = 'softmax') )   # one unit per bear species
model_R1.summary()
Model: "sequential_5" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= resnet50 (Functional) (None, 7, 7, 2048) 23587712 _________________________________________________________________ flatten_5 (Flatten) (None, 100352) 0 _________________________________________________________________ dense_10 (Dense) (None, 25) 2508825 _________________________________________________________________ dense_11 (Dense) (None, 3) 78 ================================================================= Total params: 26,096,615 Trainable params: 2,508,903 Non-trainable params: 23,587,712 _________________________________________________________________
model_R1.compile( optimizer = 'rmsprop', loss = 'categorical_crossentropy', metrics = [ 'accuracy', Precision(), Recall()] )
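A side note on the optimizer: the string 'rmsprop' uses Keras's default learning rate of 0.001. If the conv5 layers were genuinely unfrozen, a much smaller learning rate is the usual choice so the pre-trained weights are not disturbed too aggressively. A sketch under that assumption (the 1e-5 value is illustrative, not the notebook's setting):
from tensorflow.keras.optimizers import RMSprop

model_R1.compile(
    optimizer = RMSprop(learning_rate=1e-5),   # illustrative fine-tuning rate
    loss = 'categorical_crossentropy',
    metrics = [ 'accuracy', Precision(), Recall()] )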
model_R1_history = model_R1.fit(
    training_data,
    validation_data = validation_data,
    epochs = 50,
    batch_size = 64,   # note: the generator supplies its own batches, so this argument is effectively ignored
    callbacks = [callback_earlystp] )
Epoch 1/50
81/81 [==============================] - 31s 339ms/step - loss: 14.6141 - accuracy: 0.5328 - precision_5: 0.5530 - recall_5: 0.4925 - val_loss: 2.5892 - val_accuracy: 0.3292 - val_precision_5: 0.3292 - val_recall_5: 0.3292
Epoch 2/50
81/81 [==============================] - 26s 322ms/step - loss: 1.2046 - accuracy: 0.6635 - precision_5: 0.7142 - recall_5: 0.4843 - val_loss: 1.8022 - val_accuracy: 0.3457 - val_precision_5: 0.3636 - val_recall_5: 0.3457
Epoch 3/50
81/81 [==============================] - 26s 321ms/step - loss: 0.8807 - accuracy: 0.6979 - precision_5: 0.8121 - recall_5: 0.6308 - val_loss: 5.4017 - val_accuracy: 0.3498 - val_precision_5: 0.3602 - val_recall_5: 0.3498
Epoch 4/50
81/81 [==============================] - 26s 325ms/step - loss: 1.1895 - accuracy: 0.6991 - precision_5: 0.7923 - recall_5: 0.6408 - val_loss: 1.4986 - val_accuracy: 0.5514 - val_precision_5: 0.5523 - val_recall_5: 0.5432
Epoch 5/50
81/81 [==============================] - 26s 325ms/step - loss: 0.6905 - accuracy: 0.7932 - precision_5: 0.8432 - recall_5: 0.6978 - val_loss: 0.9923 - val_accuracy: 0.6831 - val_precision_5: 0.7411 - val_recall_5: 0.6008
Epoch 6/50
81/81 [==============================] - 26s 319ms/step - loss: 0.6061 - accuracy: 0.8266 - precision_5: 0.8862 - recall_5: 0.7602 - val_loss: 0.9260 - val_accuracy: 0.7819 - val_precision_5: 0.8018 - val_recall_5: 0.7325
Epoch 7/50
81/81 [==============================] - 26s 322ms/step - loss: 0.4349 - accuracy: 0.8392 - precision_5: 0.8784 - recall_5: 0.7647 - val_loss: 0.9249 - val_accuracy: 0.8025 - val_precision_5: 0.8732 - val_recall_5: 0.7366
Epoch 8/50
81/81 [==============================] - 26s 322ms/step - loss: 0.3797 - accuracy: 0.8572 - precision_5: 0.9071 - recall_5: 0.8157 - val_loss: 1.0182 - val_accuracy: 0.8272 - val_precision_5: 0.8419 - val_recall_5: 0.8107
Epoch 9/50
81/81 [==============================] - 26s 321ms/step - loss: 0.4715 - accuracy: 0.8614 - precision_5: 0.8919 - recall_5: 0.8197 - val_loss: 0.8817 - val_accuracy: 0.8436 - val_precision_5: 0.8523 - val_recall_5: 0.8313
Epoch 10/50
81/81 [==============================] - 26s 324ms/step - loss: 0.3037 - accuracy: 0.9199 - precision_5: 0.9436 - recall_5: 0.8999 - val_loss: 0.9945 - val_accuracy: 0.8436 - val_precision_5: 0.8673 - val_recall_5: 0.8066
Epoch 11/50
81/81 [==============================] - 26s 321ms/step - loss: 0.2971 - accuracy: 0.9175 - precision_5: 0.9446 - recall_5: 0.8865 - val_loss: 1.1544 - val_accuracy: 0.8354 - val_precision_5: 0.8419 - val_recall_5: 0.8107
Epoch 12/50
81/81 [==============================] - 26s 323ms/step - loss: 0.3331 - accuracy: 0.9269 - precision_5: 0.9542 - recall_5: 0.9091 - val_loss: 1.2145 - val_accuracy: 0.8519 - val_precision_5: 0.8767 - val_recall_5: 0.8189
Epoch 13/50
81/81 [==============================] - 26s 313ms/step - loss: 0.1643 - accuracy: 0.9260 - precision_5: 0.9692 - recall_5: 0.9061 - val_loss: 1.6019 - val_accuracy: 0.8230 - val_precision_5: 0.8193 - val_recall_5: 0.8025
Epoch 14/50
81/81 [==============================] - 26s 325ms/step - loss: 0.2570 - accuracy: 0.9062 - precision_5: 0.9398 - recall_5: 0.8877 - val_loss: 1.8115 - val_accuracy: 0.8519 - val_precision_5: 0.8571 - val_recall_5: 0.8395
Epoch 15/50
81/81 [==============================] - 26s 322ms/step - loss: 0.1835 - accuracy: 0.9248 - precision_5: 0.9709 - recall_5: 0.8993 - val_loss: 1.5404 - val_accuracy: 0.8395 - val_precision_5: 0.8481 - val_recall_5: 0.8272
Epoch 16/50
81/81 [==============================] - 26s 323ms/step - loss: 0.3429 - accuracy: 0.9247 - precision_5: 0.9463 - recall_5: 0.9156 - val_loss: 1.8778 - val_accuracy: 0.8189 - val_precision_5: 0.8250 - val_recall_5: 0.8148
Epoch 17/50
81/81 [==============================] - 26s 325ms/step - loss: 0.2569 - accuracy: 0.9288 - precision_5: 0.9465 - recall_5: 0.9215 - val_loss: 2.6844 - val_accuracy: 0.8395 - val_precision_5: 0.8430 - val_recall_5: 0.8395
Epoch 18/50
81/81 [==============================] - 26s 317ms/step - loss: 0.1379 - accuracy: 0.9567 - precision_5: 0.9675 - recall_5: 0.9509 - val_loss: 2.2401 - val_accuracy: 0.7531 - val_precision_5: 0.7826 - val_recall_5: 0.7407
Epoch 19/50
81/81 [==============================] - 26s 324ms/step - loss: 0.2280 - accuracy: 0.9509 - precision_5: 0.9621 - recall_5: 0.9419 - val_loss: 2.3554 - val_accuracy: 0.7819 - val_precision_5: 0.7819 - val_recall_5: 0.7819
Epoch 20/50
81/81 [==============================] - 26s 326ms/step - loss: 0.1080 - accuracy: 0.9621 - precision_5: 0.9649 - recall_5: 0.9595 - val_loss: 1.8919 - val_accuracy: 0.8313 - val_precision_5: 0.8382 - val_recall_5: 0.8313
Epoch 21/50
81/81 [==============================] - 26s 323ms/step - loss: 0.0903 - accuracy: 0.9711 - precision_5: 0.9742 - recall_5: 0.9653 - val_loss: 2.6306 - val_accuracy: 0.7366 - val_precision_5: 0.7564 - val_recall_5: 0.7284
Epoch 22/50
81/81 [==============================] - 26s 325ms/step - loss: 0.1015 - accuracy: 0.9729 - precision_5: 0.9862 - recall_5: 0.9576 - val_loss: 1.6133 - val_accuracy: 0.8436 - val_precision_5: 0.8536 - val_recall_5: 0.8395
Epoch 23/50
81/81 [==============================] - 26s 327ms/step - loss: 0.0790 - accuracy: 0.9701 - precision_5: 0.9814 - recall_5: 0.9670 - val_loss: 2.0802 - val_accuracy: 0.8107 - val_precision_5: 0.8140 - val_recall_5: 0.8107
Epoch 24/50
81/81 [==============================] - 26s 322ms/step - loss: 0.0427 - accuracy: 0.9849 - precision_5: 0.9921 - recall_5: 0.9846 - val_loss: 3.6203 - val_accuracy: 0.8395 - val_precision_5: 0.8395 - val_recall_5: 0.8395
Epoch 00024: early stopping
# Plot training vs. validation accuracy, precision, and recall across epochs
plt.subplots_adjust(right=1.95, left=.03)
plt.subplot(1,3,1)
plt.plot(model_R1_history.history['accuracy'])
plt.plot(model_R1_history.history['val_accuracy'])
plt.ylabel('Accuracy')
plt.xlabel('')
plt.legend(['training','validation'], loc="lower right")
plt.subplot(1,3,2)
plt.plot(model_R1_history.history['precision_5'])
plt.plot(model_R1_history.history['val_precision_5'])
plt.ylabel('Precision')
plt.xlabel('Epoch')
plt.subplot(1,3,3)
plt.plot(model_R1_history.history['recall_5'])
plt.plot(model_R1_history.history['val_recall_5'])
plt.ylabel('Recall')
plt.xlabel('')
plt.show()
test_loss, test_acc, test_precision, test_recall = model_R1.evaluate(testing_data)
print('%s %.2f' % ('test_acc:', test_acc * 100.0))
print('%s %.2f' % ('test_loss:', test_loss))
print('%s %.2f' % ('test_precision:', test_precision))
print('%s %.2f' % ('test_recall:', test_recall))
27/27 [==============================] - 6s 231ms/step - loss: 3.3572 - accuracy: 0.7737 - precision_5: 0.7737 - recall_5: 0.7737
test_acc: 77.37
test_loss: 3.36
test_precision: 0.77
test_recall: 0.77
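A 77% test accuracy invites a per-class look at where the ResNet50 model goes wrong. The sketch below assumes testing_data is the same DirectoryIterator evaluated above and that scikit-learn is available; it builds a confusion matrix from one pass over the test batches.
import numpy as np
from sklearn.metrics import confusion_matrix, classification_report

y_true, y_pred = [], []
for i in range(len(testing_data)):        # one full pass over the test batches
    x_batch, y_batch = testing_data[i]    # images and one-hot labels stay aligned within a batch
    probs = model_R1.predict(x_batch, verbose=0)
    y_true.extend(np.argmax(y_batch, axis=1))
    y_pred.extend(np.argmax(probs, axis=1))

print(confusion_matrix(y_true, y_pred))
print(classification_report(y_true, y_pred, target_names=list(testing_data.class_indices)))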
Compared to both the VGG16 and DenseNet121 networks, ResNet50 appears to have overfit the data when given the same fully connected head as the other two: training accuracy climbs past 98% while validation accuracy plateaus around 84%, validation loss keeps growing, and test accuracy lands at only 77%. Two caveats may partly explain the gap: the conv5 block we intended to fine-tune stayed frozen (see the parameter counts in the model summary above), and ResNet50's ImageNet weights expect the Caffe-style normalization of resnet50.preprocess_input rather than a plain 1/255 rescale.
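As a sketch of the preprocessing fix, the generators could be rebuilt with ResNet50's own preprocess_input; the directory path and batch size below are assumptions carried over from the earlier cells, not values from this notebook.
from tensorflow.keras.applications.resnet50 import preprocess_input
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Apply ResNet50's expected Caffe-style normalization instead of rescale=1./255
resnet_gen = ImageDataGenerator(preprocessing_function=preprocess_input)
training_data_resnet = resnet_gen.flow_from_directory(
    'bears/training',           # assumed path; use the same split as above
    target_size=(224, 224),     # ResNet50's expected input size
    batch_size=16,
    class_mode='categorical')
VGG16 and DenseNet121 expose analogous preprocess_input functions in their own tensorflow.keras.applications modules.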